Test Report: Docker_Linux_containerd_arm64 22230

c636a8658fdd5cfdd18416b9a30087c97060a836:2025-12-19:42856

Failed tests (26/321)

Order  Failed test  Duration (s)
99 TestFunctional/parallel/DashboardCmd 23.96
171 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy 502.35
173 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart 367.95
175 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods 2.67
185 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd 2.24
186 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly 2.39
187 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig 735.64
188 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth 2.29
191 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/InvalidService 0.06
194 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd 1.77
197 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd 3.11
201 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect 2.41
203 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim 241.68
213 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels 1.41
219 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/RunSecondTunnel 0.46
222 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/WaitService/Setup 0.13
223 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessDirect 97.18
228 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/DeployApp 0.06
229 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/List 0.26
230 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/JSONOutput 0.28
231 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/HTTPS 0.28
232 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/Format 0.26
233 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/URL 0.3
237 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port 2.65
358 TestKubernetesUpgrade 800.57
436 TestStartStop/group/default-k8s-diff-port/serial/SecondStart 7200.08
TestFunctional/parallel/DashboardCmd (23.96s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd

=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:920: (dbg) daemon: [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-125117 --alsologtostderr -v=1]
functional_test.go:933: output didn't produce a URL
functional_test.go:925: (dbg) stopping [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-125117 --alsologtostderr -v=1] ...
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-125117 --alsologtostderr -v=1] stdout:
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-125117 --alsologtostderr -v=1] stderr:
I1219 05:57:01.123220 2047823 out.go:360] Setting OutFile to fd 1 ...
I1219 05:57:01.124468 2047823 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 05:57:01.124487 2047823 out.go:374] Setting ErrFile to fd 2...
I1219 05:57:01.124494 2047823 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 05:57:01.124799 2047823 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 05:57:01.125094 2047823 mustload.go:66] Loading cluster: functional-125117
I1219 05:57:01.125526 2047823 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1219 05:57:01.125977 2047823 cli_runner.go:164] Run: docker container inspect functional-125117 --format={{.State.Status}}
I1219 05:57:01.147675 2047823 host.go:66] Checking if "functional-125117" exists ...
I1219 05:57:01.148075 2047823 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1219 05:57:01.306980 2047823 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 05:57:01.289346768 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aa
rch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pa
th:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1219 05:57:01.307132 2047823 api_server.go:166] Checking apiserver status ...
I1219 05:57:01.307221 2047823 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1219 05:57:01.307290 2047823 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-125117
I1219 05:57:01.388815 2047823 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34699 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-125117/id_rsa Username:docker}
I1219 05:57:01.518989 2047823 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/16555/cgroup
I1219 05:57:01.533269 2047823 api_server.go:182] apiserver freezer: "9:freezer:/docker/2221e6b75bc77fdbcbd5081aa99df3d95513803d491effb7dd928cc5dbb9c46d/kubepods/burstable/podf74c08633a183da1956c6260ce388cbc/1b11907af61cdf1d977c26132016a1b5f9944bc16adc124eeabcf1127edd2cdc"
I1219 05:57:01.533359 2047823 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/2221e6b75bc77fdbcbd5081aa99df3d95513803d491effb7dd928cc5dbb9c46d/kubepods/burstable/podf74c08633a183da1956c6260ce388cbc/1b11907af61cdf1d977c26132016a1b5f9944bc16adc124eeabcf1127edd2cdc/freezer.state
I1219 05:57:01.544048 2047823 api_server.go:204] freezer state: "THAWED"
I1219 05:57:01.544092 2047823 api_server.go:253] Checking apiserver healthz at https://192.168.49.2:8441/healthz ...
I1219 05:57:01.556133 2047823 api_server.go:279] https://192.168.49.2:8441/healthz returned 200:
ok
W1219 05:57:01.556180 2047823 out.go:285] * Enabling dashboard ...
* Enabling dashboard ...
I1219 05:57:01.556370 2047823 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1219 05:57:01.556382 2047823 addons.go:70] Setting dashboard=true in profile "functional-125117"
I1219 05:57:01.556390 2047823 addons.go:239] Setting addon dashboard=true in "functional-125117"
I1219 05:57:01.556412 2047823 host.go:66] Checking if "functional-125117" exists ...
I1219 05:57:01.556820 2047823 cli_runner.go:164] Run: docker container inspect functional-125117 --format={{.State.Status}}
I1219 05:57:01.633738 2047823 addons.go:436] installing /etc/kubernetes/addons/dashboard-admin.yaml
I1219 05:57:01.633768 2047823 ssh_runner.go:362] scp dashboard/dashboard-admin.yaml --> /etc/kubernetes/addons/dashboard-admin.yaml (373 bytes)
I1219 05:57:01.633864 2047823 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-125117
I1219 05:57:01.746853 2047823 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34699 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-125117/id_rsa Username:docker}
I1219 05:57:01.900887 2047823 ssh_runner.go:195] Run: test -f /usr/bin/helm
I1219 05:57:01.905988 2047823 ssh_runner.go:195] Run: test -f /usr/local/bin/helm
I1219 05:57:01.909300 2047823 ssh_runner.go:195] Run: sudo bash -c "curl -fsSL -o get_helm.sh https://raw.githubusercontent.com/helm/helm/main/scripts/get-helm-3 && chmod 700 get_helm.sh && HELM_INSTALL_DIR=/usr/bin ./get_helm.sh"
I1219 05:57:03.279655 2047823 ssh_runner.go:235] Completed: sudo bash -c "curl -fsSL -o get_helm.sh https://raw.githubusercontent.com/helm/helm/main/scripts/get-helm-3 && chmod 700 get_helm.sh && HELM_INSTALL_DIR=/usr/bin ./get_helm.sh": (1.370319451s)
I1219 05:57:03.279761 2047823 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig helm upgrade --install kubernetes-dashboard kubernetes-dashboard --create-namespace --repo https://kubernetes.github.io/dashboard/ --namespace kubernetes-dashboard --set nginx.enabled=false --set cert-manager.enabled=false --set metrics-server.enabled=false --set kong.proxy.type=NodePort
I1219 05:57:07.131899 2047823 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig helm upgrade --install kubernetes-dashboard kubernetes-dashboard --create-namespace --repo https://kubernetes.github.io/dashboard/ --namespace kubernetes-dashboard --set nginx.enabled=false --set cert-manager.enabled=false --set metrics-server.enabled=false --set kong.proxy.type=NodePort: (3.852099438s)
I1219 05:57:07.131978 2047823 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.34.3/kubectl apply -f /etc/kubernetes/addons/dashboard-admin.yaml
I1219 05:57:07.896151 2047823 addons.go:500] Verifying addon dashboard=true in "functional-125117"
I1219 05:57:07.896474 2047823 cli_runner.go:164] Run: docker container inspect functional-125117 --format={{.State.Status}}
I1219 05:57:07.930293 2047823 out.go:179] * Verifying dashboard addon...
I1219 05:57:07.934228 2047823 kapi.go:59] client config for functional-125117: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil
), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
I1219 05:57:07.934749 2047823 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
I1219 05:57:07.934762 2047823 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
I1219 05:57:07.934767 2047823 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
I1219 05:57:07.934772 2047823 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
I1219 05:57:07.934780 2047823 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
I1219 05:57:07.935013 2047823 kapi.go:75] Waiting for pod with label "app.kubernetes.io/name=kubernetes-dashboard-web" in ns "kubernetes-dashboard" ...
I1219 05:57:07.952545 2047823 kapi.go:86] Found 1 Pods for label selector app.kubernetes.io/name=kubernetes-dashboard-web
I1219 05:57:07.952567 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:08.440409 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:08.939508 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:09.438445 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:09.939372 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:10.439513 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:10.938983 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:11.439181 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:11.939078 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:12.438637 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:12.942333 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:13.439043 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:13.938665 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:14.437922 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:14.938117 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:15.438989 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:15.948839 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:16.438928 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:16.945365 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:17.439923 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:17.939197 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:18.439418 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:18.938952 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:19.438480 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:19.939179 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:20.438694 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:20.938112 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:21.440646 2047823 kapi.go:96] waiting for pod "app.kubernetes.io/name=kubernetes-dashboard-web", current state: Pending: [<nil>]
I1219 05:57:21.938655 2047823 kapi.go:107] duration metric: took 14.00364192s to wait for app.kubernetes.io/name=kubernetes-dashboard-web ...
I1219 05:57:21.942197 2047823 out.go:179] * Some dashboard features require the metrics-server addon. To enable all features please run:

	minikube -p functional-125117 addons enable metrics-server

I1219 05:57:21.945176 2047823 addons.go:202] Writing out "functional-125117" config to set dashboard=true...
W1219 05:57:21.945441 2047823 out.go:285] * Verifying dashboard health ...
* Verifying dashboard health ...
I1219 05:57:21.945947 2047823 kapi.go:59] client config for functional-125117: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil
), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
I1219 05:57:21.948688 2047823 service.go:215] Found service: &Service{ObjectMeta:{kubernetes-dashboard-kong-proxy  kubernetes-dashboard  bf7ddeff-9858-4aa5-85e6-f98ffb60765b 704 0 2025-12-19 05:57:06 +0000 UTC <nil> <nil> map[app.kubernetes.io/instance:kubernetes-dashboard app.kubernetes.io/managed-by:Helm app.kubernetes.io/name:kong app.kubernetes.io/version:3.9 enable-metrics:true helm.sh/chart:kong-2.52.0] map[meta.helm.sh/release-name:kubernetes-dashboard meta.helm.sh/release-namespace:kubernetes-dashboard] [] [] [{helm Update v1 2025-12-19 05:57:06 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:meta.helm.sh/release-name":{},"f:meta.helm.sh/release-namespace":{}},"f:labels":{".":{},"f:app.kubernetes.io/instance":{},"f:app.kubernetes.io/managed-by":{},"f:app.kubernetes.io/name":{},"f:app.kubernetes.io/version":{},"f:enable-metrics":{},"f:helm.sh/chart":{}}},"f:spec":{"f:externalTrafficPolicy":{},"f:internalTrafficPolicy":{},"f:ports":{".":{},"k:{\"port\":443,\"protocol\":\"TCP\"}":{".
":{},"f:name":{},"f:port":{},"f:protocol":{},"f:targetPort":{}}},"f:selector":{},"f:sessionAffinity":{},"f:type":{}}} }]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:kong-proxy-tls,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:32145,AppProtocol:nil,},},Selector:map[string]string{app.kubernetes.io/component: app,app.kubernetes.io/instance: kubernetes-dashboard,app.kubernetes.io/name: kong,},ClusterIP:10.98.222.17,Type:NodePort,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:Cluster,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.98.222.17],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}
I1219 05:57:21.948938 2047823 host.go:66] Checking if "functional-125117" exists ...
I1219 05:57:21.949222 2047823 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-125117
I1219 05:57:21.978952 2047823 kapi.go:59] client config for functional-125117: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil
), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
I1219 05:57:21.986860 2047823 warnings.go:110] "Warning: v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice"
I1219 05:57:21.990722 2047823 warnings.go:110] "Warning: v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice"
I1219 05:57:21.994388 2047823 warnings.go:110] "Warning: v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice"
I1219 05:57:21.998270 2047823 warnings.go:110] "Warning: v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice"
I1219 05:57:22.182709 2047823 warnings.go:110] "Warning: v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice"
I1219 05:57:22.291030 2047823 out.go:179] * Dashboard Token:
I1219 05:57:22.293919 2047823 out.go:203] eyJhbGciOiJSUzI1NiIsImtpZCI6InlGUWpxMm1XaHJGQkJ1TEE5ci10cWpmMzR3R1lrR093bHFDYWQ1cTRJQ2cifQ.eyJhdWQiOlsiaHR0cHM6Ly9rdWJlcm5ldGVzLmRlZmF1bHQuc3ZjLmNsdXN0ZXIubG9jYWwiXSwiZXhwIjoxNzY2MjEwMjQyLCJpYXQiOjE3NjYxMjM4NDIsImlzcyI6Imh0dHBzOi8va3ViZXJuZXRlcy5kZWZhdWx0LnN2Yy5jbHVzdGVyLmxvY2FsIiwianRpIjoiYzc2MzExMmYtOTdlMC00MGY0LTg3ZDktNWY1MTdkMzM5ODE5Iiwia3ViZXJuZXRlcy5pbyI6eyJuYW1lc3BhY2UiOiJrdWJlcm5ldGVzLWRhc2hib2FyZCIsInNlcnZpY2VhY2NvdW50Ijp7Im5hbWUiOiJhZG1pbi11c2VyIiwidWlkIjoiMGRkODI3MDgtM2MwZC00Y2NjLTg1YzEtZTNiZDMyZGRlMTA2In19LCJuYmYiOjE3NjYxMjM4NDIsInN1YiI6InN5c3RlbTpzZXJ2aWNlYWNjb3VudDprdWJlcm5ldGVzLWRhc2hib2FyZDphZG1pbi11c2VyIn0.ruDSRWIQMH7IV_Y7WKR_D6vmiRjcH9r7nfTAYRpqcKjHDXWo5P84FpyrYNCdMiuPaEF17vorwJLOysRHH7LM18poxYVuAyZUBHH4VHpn1KHxlFGnyuIlybM4PBwYlj7YO2IhlKf0_NJ9F364cGMxZdd0AwnuC_u_bU3peR6lk9UHP8owmsSk113BCXcaSiq-ZGzgnK8EvtUQM7PqZwmM2r0aJT6-nQySJGBE0CEPW2TXRHdLI8zn4d05NoI4cUH9itpvcXCGgEjTdPDcXdqAYHetSlC8ODpCfhJ0xDiKbbzOptcGHMFFr7FCq-mWnqd19IqQoHWnBtl9CjbgMNvxPQ
I1219 05:57:22.297036 2047823 out.go:203] https://192.168.49.2:32145
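Note on the failure mode: the test asserts at functional_test.go:933 that the dashboard command's output produces a URL, but in the capture above stdout is empty and the URL (`https://192.168.49.2:32145`) only appears in the stderr log at 05:57:22, roughly 21s after launch — suggesting the URL arrived after (or on the wrong stream for) the harness's check. A minimal, hypothetical sketch of the kind of URL scan such a check performs (the `firstURL` helper is illustrative, not minikube's actual code):

```go
package main

import (
	"fmt"
	"regexp"
)

// urlRe matches an http/https URL token, such as the address the
// `minikube dashboard --url` command is expected to print.
var urlRe = regexp.MustCompile(`https?://[^\s]+`)

// firstURL scans command output line by line and returns the first
// URL found, or ok=false if no line contains one.
func firstURL(lines []string) (url string, ok bool) {
	for _, line := range lines {
		if m := urlRe.FindString(line); m != "" {
			return m, true
		}
	}
	return "", false
}

func main() {
	// Simulated command output: status lines, then the URL.
	out := []string{
		"* Enabling dashboard ...",
		"* Verifying dashboard health ...",
		"https://192.168.49.2:32145",
	}
	if url, ok := firstURL(out); ok {
		fmt.Println("found:", url)
	} else {
		fmt.Println("output didn't produce a URL")
	}
}
```

If the harness only scans stdout, a URL emitted on stderr (as in the log above) would trip exactly this "output didn't produce a URL" branch.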
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctional/parallel/DashboardCmd]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctional/parallel/DashboardCmd]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-125117
helpers_test.go:244: (dbg) docker inspect functional-125117:

-- stdout --
	[
	    {
	        "Id": "2221e6b75bc77fdbcbd5081aa99df3d95513803d491effb7dd928cc5dbb9c46d",
	        "Created": "2025-12-19T05:50:21.175219791Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2024795,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T05:50:21.237188659Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/2221e6b75bc77fdbcbd5081aa99df3d95513803d491effb7dd928cc5dbb9c46d/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/2221e6b75bc77fdbcbd5081aa99df3d95513803d491effb7dd928cc5dbb9c46d/hostname",
	        "HostsPath": "/var/lib/docker/containers/2221e6b75bc77fdbcbd5081aa99df3d95513803d491effb7dd928cc5dbb9c46d/hosts",
	        "LogPath": "/var/lib/docker/containers/2221e6b75bc77fdbcbd5081aa99df3d95513803d491effb7dd928cc5dbb9c46d/2221e6b75bc77fdbcbd5081aa99df3d95513803d491effb7dd928cc5dbb9c46d-json.log",
	        "Name": "/functional-125117",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-125117:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-125117",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "2221e6b75bc77fdbcbd5081aa99df3d95513803d491effb7dd928cc5dbb9c46d",
	                "LowerDir": "/var/lib/docker/overlay2/fec91f26491afaab5012266e19a5baf181492c8d37d8a523a0279dcb6bb7b60e-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/fec91f26491afaab5012266e19a5baf181492c8d37d8a523a0279dcb6bb7b60e/merged",
	                "UpperDir": "/var/lib/docker/overlay2/fec91f26491afaab5012266e19a5baf181492c8d37d8a523a0279dcb6bb7b60e/diff",
	                "WorkDir": "/var/lib/docker/overlay2/fec91f26491afaab5012266e19a5baf181492c8d37d8a523a0279dcb6bb7b60e/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-125117",
	                "Source": "/var/lib/docker/volumes/functional-125117/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-125117",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-125117",
	                "name.minikube.sigs.k8s.io": "functional-125117",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "ac6be209147ff83ef33be0404c30a3d52883a893b906849415e17071827a7832",
	            "SandboxKey": "/var/run/docker/netns/ac6be209147f",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34699"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34700"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34703"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34701"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34702"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-125117": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "ea:33:98:b2:6d:ff",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "6e3ca32369972a40f97c88e40ef8ebd8b3faecd38cd192c4fdab30ed0f8e624a",
	                    "EndpointID": "d8731576aa6adeabdd47ddb63b5b54548f6275ebd89ddc275bf6ea852e3f8f98",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-125117",
	                        "2221e6b75bc7"
	                    ]
	                }
	            }
	        }
	    }
	]
-- /stdout --
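The inspect output above shows the published-port layout the test relies on: each container port (`22/tcp`, `2376/tcp`, `8441/tcp`, …) is bound to a distinct high port on `127.0.0.1`, and the apiserver port `8441/tcp` maps to host port `34702`. As a minimal sketch (not minikube's actual helper code), the `NetworkSettings.Ports` map can be decoded like this; `hostEndpoint` is a hypothetical name:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// portBinding mirrors one entry in a docker-inspect NetworkSettings.Ports map.
type portBinding struct {
	HostIp   string
	HostPort string
}

// hostEndpoint decodes a Ports map fragment and returns the first
// host ip:port bound to the given container port (e.g. "8441/tcp").
func hostEndpoint(portsJSON, port string) (string, error) {
	var ports map[string][]portBinding
	if err := json.Unmarshal([]byte(portsJSON), &ports); err != nil {
		return "", err
	}
	bindings := ports[port]
	if len(bindings) == 0 {
		return "", fmt.Errorf("no binding for %s", port)
	}
	return bindings[0].HostIp + ":" + bindings[0].HostPort, nil
}

func main() {
	// Fragment taken from the inspect output above.
	raw := `{"8441/tcp":[{"HostIp":"127.0.0.1","HostPort":"34702"}]}`
	ep, err := hostEndpoint(raw, "8441/tcp")
	if err != nil {
		panic(err)
	}
	fmt.Println(ep) // the address the test harness dials for the apiserver
}
```

With the values recorded above this resolves to `127.0.0.1:34702`, which is the endpoint the kubeconfig for this profile points at.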
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-125117 -n functional-125117
helpers_test.go:253: <<< TestFunctional/parallel/DashboardCmd FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctional/parallel/DashboardCmd]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-125117 logs -n 25: (1.605495152s)
helpers_test.go:261: TestFunctional/parallel/DashboardCmd logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-125117 image load --daemon kicbase/echo-server:functional-125117 --alsologtostderr                                                                   │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                                      │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image save kicbase/echo-server:functional-125117 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image rm kicbase/echo-server:functional-125117 --alsologtostderr                                                                              │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                                      │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                                      │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image save --daemon kicbase/echo-server:functional-125117 --alsologtostderr                                                                   │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/2000386.pem                                                                                                       │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /usr/share/ca-certificates/2000386.pem                                                                                           │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                                        │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/20003862.pem                                                                                                      │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /usr/share/ca-certificates/20003862.pem                                                                                          │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/test/nested/copy/2000386/hosts                                                                                              │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format short --alsologtostderr                                                                                                     │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format yaml --alsologtostderr                                                                                                      │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh pgrep buildkitd                                                                                                                           │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │                     │
	│ image          │ functional-125117 image build -t localhost/my-image:functional-125117 testdata/build --alsologtostderr                                                          │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                                      │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format json --alsologtostderr                                                                                                      │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format table --alsologtostderr                                                                                                     │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                                         │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                                         │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                                         │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 05:57:00
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 05:57:00.825630 2047702 out.go:360] Setting OutFile to fd 1 ...
	I1219 05:57:00.825786 2047702 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:57:00.825814 2047702 out.go:374] Setting ErrFile to fd 2...
	I1219 05:57:00.825821 2047702 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:57:00.826235 2047702 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 05:57:00.826818 2047702 out.go:368] Setting JSON to false
	I1219 05:57:00.828353 2047702 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":38367,"bootTime":1766085454,"procs":205,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 05:57:00.828503 2047702 start.go:143] virtualization:  
	I1219 05:57:00.831878 2047702 out.go:179] * [functional-125117] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 05:57:00.835744 2047702 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 05:57:00.835808 2047702 notify.go:221] Checking for updates...
	I1219 05:57:00.842106 2047702 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 05:57:00.844921 2047702 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 05:57:00.847807 2047702 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 05:57:00.850648 2047702 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 05:57:00.853481 2047702 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 05:57:00.856743 2047702 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1219 05:57:00.857481 2047702 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 05:57:00.887079 2047702 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 05:57:00.887809 2047702 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 05:57:00.981174 2047702 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 05:57:00.972012169 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 05:57:00.981280 2047702 docker.go:319] overlay module found
	I1219 05:57:00.984383 2047702 out.go:179] * Using the docker driver based on existing profile
	I1219 05:57:00.987320 2047702 start.go:309] selected driver: docker
	I1219 05:57:00.987344 2047702 start.go:928] validating driver "docker" against &{Name:functional-125117 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-125117 Namespace:default APIServerHAVIP: APIServerNa
me:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOpt
ions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 05:57:00.987457 2047702 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 05:57:00.987563 2047702 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 05:57:01.066248 2047702 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 05:57:01.056970814 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 05:57:01.066699 2047702 cni.go:84] Creating CNI manager for ""
	I1219 05:57:01.066777 2047702 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 05:57:01.066826 2047702 start.go:353] cluster config:
	{Name:functional-125117 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-125117 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Containe
rRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizati
ons:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 05:57:01.071671 2047702 out.go:179] * dry-run validation complete!
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED                  STATE               NAME                                   ATTEMPT             POD ID              POD                                                     NAMESPACE
	d56bac3d8d52f       8dcebcf593999       Less than a second ago   Running             kubernetes-dashboard-auth              0                   0225e8cd2103b       kubernetes-dashboard-auth-55fb9bbdf8-4b2hl              kubernetes-dashboard
	29f9704c03ef9       d71ba84d8f0d2       1 second ago             Running             kubernetes-dashboard-metrics-scraper   0                   c275930c27ba8       kubernetes-dashboard-metrics-scraper-7685fd8b77-nrkjq   kubernetes-dashboard
	985ee70cda830       2c51e8aea46c6       2 seconds ago            Running             kubernetes-dashboard-web               0                   60b553c003886       kubernetes-dashboard-web-5c9f966b98-wllqv               kubernetes-dashboard
	89d78fe89c73a       85ac4c11285e7       6 seconds ago            Running             kubernetes-dashboard-api               0                   abe31bffd8fd0       kubernetes-dashboard-api-5f6dd64f4-wz2l2                kubernetes-dashboard
	f570c4aeb8c9e       2bf86f243d250       6 seconds ago            Running             proxy                                  0                   89848159ab3b4       kubernetes-dashboard-kong-9849c64bd-527k2               kubernetes-dashboard
	fa783f0689484       2bf86f243d250       8 seconds ago            Exited              clear-stale-pid                        0                   89848159ab3b4       kubernetes-dashboard-kong-9849c64bd-527k2               kubernetes-dashboard
	b1ee8306f19fd       1611cd07b61d5       28 seconds ago           Exited              mount-munger                           0                   daa6c1e249943       busybox-mount                                           default
	df159e0cc3b97       ce2d2cda2d858       35 seconds ago           Running             echo-server                            0                   34b2c94f7f2c5       hello-node-75c85bcc94-b9fdv                             default
	823a7eff1bd3b       962dbbc0e55ec       40 seconds ago           Running             myfrontend                             0                   2a2ea65afd873       sp-pod                                                  default
	6dab832abda4b       ce2d2cda2d858       44 seconds ago           Running             echo-server                            0                   e5e35ac64f393       hello-node-connect-7d85dfc575-fknbj                     default
	a9b205fafc432       962dbbc0e55ec       51 seconds ago           Running             nginx                                  0                   c99225971f165       nginx-svc                                               default
	2020f80b4026c       138784d87c9c5       About a minute ago       Running             coredns                                0                   4aae0884bc674       coredns-66bc5c9577-s2j74                                kube-system
	1695d1b349c32       c96ee3c174987       About a minute ago       Running             kindnet-cni                            0                   97f4d0f17f59a       kindnet-xm479                                           kube-system
	be5400fb900ed       138784d87c9c5       About a minute ago       Running             coredns                                0                   bfb2d9a2f2bc6       coredns-66bc5c9577-jvgjh                                kube-system
	616e08daf003a       4461daf6b6af8       About a minute ago       Running             kube-proxy                             0                   45c23361c46b4       kube-proxy-rglch                                        kube-system
	43e1c32c12ad7       ba04bb24b9575       About a minute ago       Running             storage-provisioner                    0                   9a73a03433d65       storage-provisioner                                     kube-system
	80f5fc9caf3e4       2f2aa21d34d2d       About a minute ago       Running             kube-scheduler                         2                   812738176915b       kube-scheduler-functional-125117                        kube-system
	b4205502d1709       7ada8ff13e54b       About a minute ago       Running             kube-controller-manager                7                   2138fcdd1893b       kube-controller-manager-functional-125117               kube-system
	1b11907af61cd       cf65ae6c8f700       About a minute ago       Running             kube-apiserver                         0                   3552ec6bd3315       kube-apiserver-functional-125117                        kube-system
	e6ec1e0e37ba6       2c5f0dedd21c2       About a minute ago       Running             etcd                                   2                   2bef43bf72df8       etcd-functional-125117                                  kube-system
	
	
	==> containerd <==
	Dec 19 05:57:22 functional-125117 containerd[3578]: time="2025-12-19T05:57:22.167396578Z" level=info msg="ImageCreate event name:\"docker.io/kubernetesui/dashboard-metrics-scraper:1.2.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 05:57:22 functional-125117 containerd[3578]: time="2025-12-19T05:57:22.169817863Z" level=info msg="stop pulling image docker.io/kubernetesui/dashboard-metrics-scraper:1.2.2: active requests=0, bytes read=11728895"
	Dec 19 05:57:22 functional-125117 containerd[3578]: time="2025-12-19T05:57:22.171669558Z" level=info msg="ImageCreate event name:\"sha256:d71ba84d8f0d22f4859613e1a4cc4636910142305ead3b53d2acaec6b69833da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 05:57:22 functional-125117 containerd[3578]: time="2025-12-19T05:57:22.194257564Z" level=info msg="ImageCreate event name:\"docker.io/kubernetesui/dashboard-metrics-scraper@sha256:5154b68252bd601cf85092b6413cb9db224af1ef89cb53009d2070dfccd30775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 05:57:22 functional-125117 containerd[3578]: time="2025-12-19T05:57:22.195886217Z" level=info msg="Pulled image \"docker.io/kubernetesui/dashboard-metrics-scraper:1.2.2\" with image id \"sha256:d71ba84d8f0d22f4859613e1a4cc4636910142305ead3b53d2acaec6b69833da\", repo tag \"docker.io/kubernetesui/dashboard-metrics-scraper:1.2.2\", repo digest \"docker.io/kubernetesui/dashboard-metrics-scraper@sha256:5154b68252bd601cf85092b6413cb9db224af1ef89cb53009d2070dfccd30775\", size \"11717950\" in 1.07081575s"
	Dec 19 05:57:22 functional-125117 containerd[3578]: time="2025-12-19T05:57:22.196049780Z" level=info msg="PullImage \"docker.io/kubernetesui/dashboard-metrics-scraper:1.2.2\" returns image reference \"sha256:d71ba84d8f0d22f4859613e1a4cc4636910142305ead3b53d2acaec6b69833da\""
	Dec 19 05:57:22 functional-125117 containerd[3578]: time="2025-12-19T05:57:22.201130802Z" level=info msg="PullImage \"docker.io/kubernetesui/dashboard-auth:1.4.0\""
	Dec 19 05:57:22 functional-125117 containerd[3578]: time="2025-12-19T05:57:22.210945248Z" level=info msg="CreateContainer within sandbox \"c275930c27ba8d726b0adf87ed3f69c62de8f8ed2f08671e6337766cd7f05237\" for container name:\"kubernetes-dashboard-metrics-scraper\""
	Dec 19 05:57:22 functional-125117 containerd[3578]: time="2025-12-19T05:57:22.233610268Z" level=info msg="Container 29f9704c03ef922b8caa0b1f4f2a3bfcd7eb0990250215e8c3580c91d26110d8: CDI devices from CRI Config.CDIDevices: []"
	Dec 19 05:57:22 functional-125117 containerd[3578]: time="2025-12-19T05:57:22.248938438Z" level=info msg="CreateContainer within sandbox \"c275930c27ba8d726b0adf87ed3f69c62de8f8ed2f08671e6337766cd7f05237\" for name:\"kubernetes-dashboard-metrics-scraper\" returns container id \"29f9704c03ef922b8caa0b1f4f2a3bfcd7eb0990250215e8c3580c91d26110d8\""
	Dec 19 05:57:22 functional-125117 containerd[3578]: time="2025-12-19T05:57:22.257839193Z" level=info msg="StartContainer for \"29f9704c03ef922b8caa0b1f4f2a3bfcd7eb0990250215e8c3580c91d26110d8\""
	Dec 19 05:57:22 functional-125117 containerd[3578]: time="2025-12-19T05:57:22.259296932Z" level=info msg="connecting to shim 29f9704c03ef922b8caa0b1f4f2a3bfcd7eb0990250215e8c3580c91d26110d8" address="unix:///run/containerd/s/0a628d2c35dcad9572d80623ad279f478ea81c9dd11caf27b49cdb71900e28b6" protocol=ttrpc version=3
	Dec 19 05:57:22 functional-125117 containerd[3578]: time="2025-12-19T05:57:22.371222425Z" level=info msg="StartContainer for \"29f9704c03ef922b8caa0b1f4f2a3bfcd7eb0990250215e8c3580c91d26110d8\" returns successfully"
	Dec 19 05:57:23 functional-125117 containerd[3578]: time="2025-12-19T05:57:23.394601061Z" level=info msg="ImageCreate event name:\"docker.io/kubernetesui/dashboard-auth:1.4.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 05:57:23 functional-125117 containerd[3578]: time="2025-12-19T05:57:23.396950640Z" level=info msg="stop pulling image docker.io/kubernetesui/dashboard-auth:1.4.0: active requests=0, bytes read=13100028"
	Dec 19 05:57:23 functional-125117 containerd[3578]: time="2025-12-19T05:57:23.400038862Z" level=info msg="ImageCreate event name:\"sha256:8dcebcf59399969d7300450ebd4b47f1b8f1ba30453e65ee77c6fb59fb27550c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 05:57:23 functional-125117 containerd[3578]: time="2025-12-19T05:57:23.410832166Z" level=info msg="ImageCreate event name:\"docker.io/kubernetesui/dashboard-auth@sha256:53e9917898bf98ff2de91f7f9bdedd3545780eb3ac72158889ae031136e9eeff\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 05:57:23 functional-125117 containerd[3578]: time="2025-12-19T05:57:23.412979791Z" level=info msg="Pulled image \"docker.io/kubernetesui/dashboard-auth:1.4.0\" with image id \"sha256:8dcebcf59399969d7300450ebd4b47f1b8f1ba30453e65ee77c6fb59fb27550c\", repo tag \"docker.io/kubernetesui/dashboard-auth:1.4.0\", repo digest \"docker.io/kubernetesui/dashboard-auth@sha256:53e9917898bf98ff2de91f7f9bdedd3545780eb3ac72158889ae031136e9eeff\", size \"13089138\" in 1.211778473s"
	Dec 19 05:57:23 functional-125117 containerd[3578]: time="2025-12-19T05:57:23.413188827Z" level=info msg="PullImage \"docker.io/kubernetesui/dashboard-auth:1.4.0\" returns image reference \"sha256:8dcebcf59399969d7300450ebd4b47f1b8f1ba30453e65ee77c6fb59fb27550c\""
	Dec 19 05:57:23 functional-125117 containerd[3578]: time="2025-12-19T05:57:23.425320886Z" level=info msg="CreateContainer within sandbox \"0225e8cd2103bde4efe107c7649c2155e9339812476b4c1d8cd89c1a9deebecb\" for container name:\"kubernetes-dashboard-auth\""
	Dec 19 05:57:23 functional-125117 containerd[3578]: time="2025-12-19T05:57:23.444087156Z" level=info msg="Container d56bac3d8d52fba6a6ff205aef02077569096618ca3b77f900e4c2ac7d44d3b9: CDI devices from CRI Config.CDIDevices: []"
	Dec 19 05:57:23 functional-125117 containerd[3578]: time="2025-12-19T05:57:23.460948282Z" level=info msg="CreateContainer within sandbox \"0225e8cd2103bde4efe107c7649c2155e9339812476b4c1d8cd89c1a9deebecb\" for name:\"kubernetes-dashboard-auth\" returns container id \"d56bac3d8d52fba6a6ff205aef02077569096618ca3b77f900e4c2ac7d44d3b9\""
	Dec 19 05:57:23 functional-125117 containerd[3578]: time="2025-12-19T05:57:23.464948924Z" level=info msg="StartContainer for \"d56bac3d8d52fba6a6ff205aef02077569096618ca3b77f900e4c2ac7d44d3b9\""
	Dec 19 05:57:23 functional-125117 containerd[3578]: time="2025-12-19T05:57:23.468856214Z" level=info msg="connecting to shim d56bac3d8d52fba6a6ff205aef02077569096618ca3b77f900e4c2ac7d44d3b9" address="unix:///run/containerd/s/fc057acea2e6c86341088ee862dc630bf98085dfd46b41b5093ed8336bfefa4b" protocol=ttrpc version=3
	Dec 19 05:57:23 functional-125117 containerd[3578]: time="2025-12-19T05:57:23.562789533Z" level=info msg="StartContainer for \"d56bac3d8d52fba6a6ff205aef02077569096618ca3b77f900e4c2ac7d44d3b9\" returns successfully"
	
	
	==> coredns [2020f80b4026ca383eb5916648bf5f061d3ca523aff9553104589e45655d43d2] <==
	maxprocs: Leaving GOMAXPROCS=2: CPU quota undefined
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 1b226df79860026c6a52e67daa10d7f0d57ec5b023288ec00c5e05f93523c894564e15b91770d3a07ae1cfbe861d15b37d4a0027e69c546ab112970993a3b03b
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	
	
	==> coredns [be5400fb900edbe189897a7de08048ea310120f2147e220b70ef98b37ddd44bc] <==
	maxprocs: Leaving GOMAXPROCS=2: CPU quota undefined
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 1b226df79860026c6a52e67daa10d7f0d57ec5b023288ec00c5e05f93523c894564e15b91770d3a07ae1cfbe861d15b37d4a0027e69c546ab112970993a3b03b
	CoreDNS-1.12.1
	linux/arm64, go1.24.1, 707c7c1
	
	
	==> describe nodes <==
	Name:               functional-125117
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=arm64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=arm64
	                    kubernetes.io/hostname=functional-125117
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=d7bd998f643f77295f2e0ab31c763be310dbe1a6
	                    minikube.k8s.io/name=functional-125117
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2025_12_19T05_56_07_0700
	                    minikube.k8s.io/version=v1.37.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Fri, 19 Dec 2025 05:56:03 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  functional-125117
	  AcquireTime:     <unset>
	  RenewTime:       Fri, 19 Dec 2025 05:57:17 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Fri, 19 Dec 2025 05:57:07 +0000   Fri, 19 Dec 2025 05:55:58 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Fri, 19 Dec 2025 05:57:07 +0000   Fri, 19 Dec 2025 05:55:58 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Fri, 19 Dec 2025 05:57:07 +0000   Fri, 19 Dec 2025 05:55:58 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Fri, 19 Dec 2025 05:57:07 +0000   Fri, 19 Dec 2025 05:56:03 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.49.2
	  Hostname:    functional-125117
	Capacity:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022300Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  203034800Ki
	  hugepages-1Gi:      0
	  hugepages-2Mi:      0
	  hugepages-32Mi:     0
	  hugepages-64Ki:     0
	  memory:             8022300Ki
	  pods:               110
	System Info:
	  Machine ID:                 02ff784b806e34735a6e229a69428228
	  System UUID:                3f17a1b0-8787-4e17-96fe-db8f8d00c153
	  Boot ID:                    03591113-7af0-4522-8acc-d2a56f93f0cf
	  Kernel Version:             5.15.0-1084-aws
	  OS Image:                   Debian GNU/Linux 12 (bookworm)
	  Operating System:           linux
	  Architecture:               arm64
	  Container Runtime Version:  containerd://2.2.0
	  Kubelet Version:            v1.34.3
	  Kube-Proxy Version:         
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (18 in total)
	  Namespace                   Name                                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                                     ------------  ----------  ---------------  -------------  ---
	  default                     hello-node-75c85bcc94-b9fdv                              0 (0%)        0 (0%)      0 (0%)           0 (0%)         36s
	  default                     hello-node-connect-7d85dfc575-fknbj                      0 (0%)        0 (0%)      0 (0%)           0 (0%)         45s
	  default                     nginx-svc                                                0 (0%)        0 (0%)      0 (0%)           0 (0%)         54s
	  default                     sp-pod                                                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         41s
	  kube-system                 coredns-66bc5c9577-jvgjh                                 100m (5%)     0 (0%)      70Mi (0%)        170Mi (2%)     72s
	  kube-system                 coredns-66bc5c9577-s2j74                                 100m (5%)     0 (0%)      70Mi (0%)        170Mi (2%)     72s
	  kube-system                 etcd-functional-125117                                   100m (5%)     0 (0%)      100Mi (1%)       0 (0%)         77s
	  kube-system                 kindnet-xm479                                            100m (5%)     100m (5%)   50Mi (0%)        50Mi (0%)      72s
	  kube-system                 kube-apiserver-functional-125117                         250m (12%)    0 (0%)      0 (0%)           0 (0%)         77s
	  kube-system                 kube-controller-manager-functional-125117                200m (10%)    0 (0%)      0 (0%)           0 (0%)         77s
	  kube-system                 kube-proxy-rglch                                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         72s
	  kube-system                 kube-scheduler-functional-125117                         100m (5%)     0 (0%)      0 (0%)           0 (0%)         78s
	  kube-system                 storage-provisioner                                      0 (0%)        0 (0%)      0 (0%)           0 (0%)         71s
	  kubernetes-dashboard        kubernetes-dashboard-api-5f6dd64f4-wz2l2                 100m (5%)     250m (12%)  200Mi (2%)       400Mi (5%)     16s
	  kubernetes-dashboard        kubernetes-dashboard-auth-55fb9bbdf8-4b2hl               100m (5%)     250m (12%)  200Mi (2%)       400Mi (5%)     16s
	  kubernetes-dashboard        kubernetes-dashboard-kong-9849c64bd-527k2                0 (0%)        0 (0%)      0 (0%)           0 (0%)         16s
	  kubernetes-dashboard        kubernetes-dashboard-metrics-scraper-7685fd8b77-nrkjq    100m (5%)     250m (12%)  200Mi (2%)       400Mi (5%)     16s
	  kubernetes-dashboard        kubernetes-dashboard-web-5c9f966b98-wllqv                100m (5%)     250m (12%)  200Mi (2%)       400Mi (5%)     16s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests      Limits
	  --------           --------      ------
	  cpu                1350m (67%)   1100m (55%)
	  memory             1090Mi (13%)  1990Mi (25%)
	  ephemeral-storage  0 (0%)        0 (0%)
	  hugepages-1Gi      0 (0%)        0 (0%)
	  hugepages-2Mi      0 (0%)        0 (0%)
	  hugepages-32Mi     0 (0%)        0 (0%)
	  hugepages-64Ki     0 (0%)        0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 69s                kube-proxy       
	  Normal   NodeAllocatableEnforced  86s                kubelet          Updated Node Allocatable limit across pods
	  Warning  CgroupV1                 86s                kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeHasSufficientMemory  86s (x8 over 86s)  kubelet          Node functional-125117 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    86s (x8 over 86s)  kubelet          Node functional-125117 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     86s (x7 over 86s)  kubelet          Node functional-125117 status is now: NodeHasSufficientPID
	  Normal   Starting                 86s                kubelet          Starting kubelet.
	  Normal   Starting                 77s                kubelet          Starting kubelet.
	  Warning  CgroupV1                 77s                kubelet          cgroup v1 support is in maintenance mode, please migrate to cgroup v2
	  Normal   NodeAllocatableEnforced  77s                kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  77s                kubelet          Node functional-125117 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    77s                kubelet          Node functional-125117 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     77s                kubelet          Node functional-125117 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           73s                node-controller  Node functional-125117 event: Registered Node functional-125117 in Controller
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> etcd [e6ec1e0e37ba648400fccb6fa32f0cbf03a7dcb4974ea051aa2b9851579e4135] <==
	{"level":"warn","ts":"2025-12-19T05:56:02.566494Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:52630","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:56:02.597144Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:52638","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:56:02.604939Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:52666","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:56:02.635242Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:52678","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:56:02.655305Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:52710","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:56:02.665980Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:52718","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:56:02.689901Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:52724","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:56:02.701548Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:52746","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:56:02.727470Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:52764","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:56:02.752998Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:52780","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:56:02.769498Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:52802","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:56:02.792723Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:52830","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:56:02.869320Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:52848","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:57:10.570627Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33742","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:57:10.595193Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33752","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:57:10.640955Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33776","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:57:10.671108Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33808","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:57:10.703724Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33814","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:57:10.739955Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33832","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:57:10.814597Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33872","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:57:10.818625Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33858","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:57:10.829746Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33878","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:57:10.857735Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33896","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:57:10.876226Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33918","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2025-12-19T05:57:10.913117Z","caller":"embed/config_logging.go:188","msg":"rejected connection on client endpoint","remote-addr":"127.0.0.1:33936","server-name":"","error":"EOF"}
	
	
	==> kernel <==
	 05:57:24 up 10:39,  0 user,  load average: 1.90, 1.43, 1.58
	Linux functional-125117 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kindnet [1695d1b349c32a60a186f9243745c83c35a6445dff4274de4e55673fea205566] <==
	I1219 05:56:14.315180       1 main.go:139] hostIP = 192.168.49.2
	podIP = 192.168.49.2
	I1219 05:56:14.315587       1 main.go:148] setting mtu 1500 for CNI 
	I1219 05:56:14.315726       1 main.go:178] kindnetd IP family: "ipv4"
	I1219 05:56:14.315771       1 main.go:182] noMask IPv4 subnets: [10.244.0.0/16]
	time="2025-12-19T05:56:14Z" level=info msg="Created plugin 10-kube-network-policies (kindnetd, handles RunPodSandbox,RemovePodSandbox)"
	I1219 05:56:14.525748       1 controller.go:377] "Starting controller" name="kube-network-policies"
	I1219 05:56:14.525859       1 controller.go:381] "Waiting for informer caches to sync"
	I1219 05:56:14.525890       1 shared_informer.go:350] "Waiting for caches to sync" controller="kube-network-policies"
	I1219 05:56:14.526226       1 controller.go:390] nri plugin exited: failed to connect to NRI service: dial unix /var/run/nri/nri.sock: connect: no such file or directory
	I1219 05:56:14.728860       1 shared_informer.go:357] "Caches are synced" controller="kube-network-policies"
	I1219 05:56:14.728887       1 metrics.go:72] Registering metrics
	I1219 05:56:14.728941       1 controller.go:711] "Syncing nftables rules"
	I1219 05:56:24.523205       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1219 05:56:24.523277       1 main.go:301] handling current node
	I1219 05:56:34.523857       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1219 05:56:34.523917       1 main.go:301] handling current node
	I1219 05:56:44.523829       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1219 05:56:44.523892       1 main.go:301] handling current node
	I1219 05:56:54.523942       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1219 05:56:54.524215       1 main.go:301] handling current node
	I1219 05:57:04.523921       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1219 05:57:04.524155       1 main.go:301] handling current node
	I1219 05:57:14.523165       1 main.go:297] Handling node with IPs: map[192.168.49.2:{}]
	I1219 05:57:14.523195       1 main.go:301] handling current node
	
	
	==> kube-apiserver [1b11907af61cdf1d977c26132016a1b5f9944bc16adc124eeabcf1127edd2cdc] <==
	I1219 05:57:04.041152       1 handler.go:285] Adding GroupVersion configuration.konghq.com v1beta1 to ResourceManager
	I1219 05:57:04.072402       1 handler.go:285] Adding GroupVersion configuration.konghq.com v1beta1 to ResourceManager
	I1219 05:57:04.089798       1 handler.go:285] Adding GroupVersion configuration.konghq.com v1alpha1 to ResourceManager
	I1219 05:57:04.106391       1 handler.go:285] Adding GroupVersion configuration.konghq.com v1alpha1 to ResourceManager
	I1219 05:57:04.120612       1 handler.go:285] Adding GroupVersion configuration.konghq.com v1alpha1 to ResourceManager
	I1219 05:57:04.141074       1 handler.go:285] Adding GroupVersion configuration.konghq.com v1beta1 to ResourceManager
	I1219 05:57:04.159888       1 handler.go:285] Adding GroupVersion configuration.konghq.com v1 to ResourceManager
	I1219 05:57:04.176191       1 handler.go:285] Adding GroupVersion configuration.konghq.com v1 to ResourceManager
	I1219 05:57:06.946022       1 alloc.go:328] "allocated clusterIPs" service="kubernetes-dashboard/kubernetes-dashboard-api" clusterIPs={"IPv4":"10.99.159.156"}
	I1219 05:57:06.959518       1 alloc.go:328] "allocated clusterIPs" service="kubernetes-dashboard/kubernetes-dashboard-web" clusterIPs={"IPv4":"10.106.227.80"}
	I1219 05:57:06.981086       1 alloc.go:328] "allocated clusterIPs" service="kubernetes-dashboard/kubernetes-dashboard-kong-proxy" clusterIPs={"IPv4":"10.98.222.17"}
	I1219 05:57:06.981413       1 alloc.go:328] "allocated clusterIPs" service="kubernetes-dashboard/kubernetes-dashboard-metrics-scraper" clusterIPs={"IPv4":"10.106.127.44"}
	I1219 05:57:06.989979       1 alloc.go:328] "allocated clusterIPs" service="kubernetes-dashboard/kubernetes-dashboard-auth" clusterIPs={"IPv4":"10.96.24.112"}
	W1219 05:57:10.560305       1 logging.go:55] [core] [Channel #262 SubChannel #263]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1219 05:57:10.593302       1 logging.go:55] [core] [Channel #266 SubChannel #267]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1219 05:57:10.638021       1 logging.go:55] [core] [Channel #270 SubChannel #271]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1219 05:57:10.670781       1 logging.go:55] [core] [Channel #274 SubChannel #275]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1219 05:57:10.696237       1 logging.go:55] [core] [Channel #278 SubChannel #279]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1219 05:57:10.739571       1 logging.go:55] [core] [Channel #282 SubChannel #283]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1219 05:57:10.781937       1 logging.go:55] [core] [Channel #286 SubChannel #287]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W1219 05:57:10.807731       1 logging.go:55] [core] [Channel #290 SubChannel #291]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1219 05:57:10.829149       1 logging.go:55] [core] [Channel #294 SubChannel #295]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1219 05:57:10.854340       1 logging.go:55] [core] [Channel #298 SubChannel #299]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W1219 05:57:10.876418       1 logging.go:55] [core] [Channel #302 SubChannel #303]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	W1219 05:57:10.907481       1 logging.go:55] [core] [Channel #306 SubChannel #307]grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", BalancerAttributes: {"<%!p(pickfirstleaf.managedByPickfirstKeyType={})>": "<%!p(bool=true)>" }}. Err: connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:2379: operation was canceled"
	
	
	==> kube-controller-manager [b4205502d1709aae1862386bc1dc7daa23c91a9a8f3792a70c5175df051a34cb] <==
	I1219 05:56:10.590286       1 shared_informer.go:356] "Caches are synced" controller="PVC protection"
	I1219 05:56:10.590323       1 shared_informer.go:356] "Caches are synced" controller="disruption"
	I1219 05:56:10.590358       1 shared_informer.go:356] "Caches are synced" controller="ReplicaSet"
	I1219 05:56:10.590372       1 shared_informer.go:356] "Caches are synced" controller="TTL after finished"
	I1219 05:56:10.590411       1 shared_informer.go:356] "Caches are synced" controller="endpoint_slice_mirroring"
	I1219 05:56:10.590545       1 shared_informer.go:356] "Caches are synced" controller="cronjob"
	I1219 05:56:10.590589       1 shared_informer.go:356] "Caches are synced" controller="job"
	I1219 05:56:10.590665       1 shared_informer.go:356] "Caches are synced" controller="stateful set"
	I1219 05:56:10.590831       1 shared_informer.go:356] "Caches are synced" controller="service-cidr-controller"
	I1219 05:56:10.591165       1 range_allocator.go:428] "Set node PodCIDR" logger="node-ipam-controller" node="functional-125117" podCIDRs=["10.244.0.0/24"]
	I1219 05:56:10.599902       1 shared_informer.go:356] "Caches are synced" controller="ephemeral"
	I1219 05:56:10.609827       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	I1219 05:57:10.550562       1 resource_quota_monitor.go:227] "QuotaMonitor created object count evaluator" logger="resourcequota-controller" resource="udpingresses.configuration.konghq.com"
	I1219 05:57:10.550608       1 resource_quota_monitor.go:227] "QuotaMonitor created object count evaluator" logger="resourcequota-controller" resource="kongconsumergroups.configuration.konghq.com"
	I1219 05:57:10.550631       1 resource_quota_monitor.go:227] "QuotaMonitor created object count evaluator" logger="resourcequota-controller" resource="ingressclassparameterses.configuration.konghq.com"
	I1219 05:57:10.550652       1 resource_quota_monitor.go:227] "QuotaMonitor created object count evaluator" logger="resourcequota-controller" resource="kongcustomentities.configuration.konghq.com"
	I1219 05:57:10.550673       1 resource_quota_monitor.go:227] "QuotaMonitor created object count evaluator" logger="resourcequota-controller" resource="kongupstreampolicies.configuration.konghq.com"
	I1219 05:57:10.550689       1 resource_quota_monitor.go:227] "QuotaMonitor created object count evaluator" logger="resourcequota-controller" resource="tcpingresses.configuration.konghq.com"
	I1219 05:57:10.550710       1 resource_quota_monitor.go:227] "QuotaMonitor created object count evaluator" logger="resourcequota-controller" resource="kongingresses.configuration.konghq.com"
	I1219 05:57:10.550777       1 resource_quota_monitor.go:227] "QuotaMonitor created object count evaluator" logger="resourcequota-controller" resource="kongconsumers.configuration.konghq.com"
	I1219 05:57:10.550798       1 resource_quota_monitor.go:227] "QuotaMonitor created object count evaluator" logger="resourcequota-controller" resource="kongplugins.configuration.konghq.com"
	I1219 05:57:10.550877       1 shared_informer.go:349] "Waiting for caches to sync" controller="resource quota"
	I1219 05:57:10.628892       1 shared_informer.go:349] "Waiting for caches to sync" controller="garbage collector"
	I1219 05:57:11.951915       1 shared_informer.go:356] "Caches are synced" controller="resource quota"
	I1219 05:57:12.029591       1 shared_informer.go:356] "Caches are synced" controller="garbage collector"
	
	
	==> kube-proxy [616e08daf003aa2bdaeae29a78794eb5d850dab46453b3f06af3b916bb8aca9f] <==
	I1219 05:56:14.079436       1 server_linux.go:53] "Using iptables proxy"
	I1219 05:56:14.190092       1 shared_informer.go:349] "Waiting for caches to sync" controller="node informer cache"
	I1219 05:56:14.290220       1 shared_informer.go:356] "Caches are synced" controller="node informer cache"
	I1219 05:56:14.290298       1 server.go:219] "Successfully retrieved NodeIPs" NodeIPs=["192.168.49.2"]
	E1219 05:56:14.290540       1 server.go:256] "Kube-proxy configuration may be incomplete or incorrect" err="nodePortAddresses is unset; NodePort connections will be accepted on all local IPs. Consider using `--nodeport-addresses primary`"
	I1219 05:56:14.310973       1 server.go:265] "kube-proxy running in dual-stack mode" primary ipFamily="IPv4"
	I1219 05:56:14.311038       1 server_linux.go:132] "Using iptables Proxier"
	I1219 05:56:14.319490       1 proxier.go:242] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses" ipFamily="IPv4"
	I1219 05:56:14.319922       1 server.go:527] "Version info" version="v1.34.3"
	I1219 05:56:14.320425       1 server.go:529] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I1219 05:56:14.322923       1 config.go:200] "Starting service config controller"
	I1219 05:56:14.323005       1 shared_informer.go:349] "Waiting for caches to sync" controller="service config"
	I1219 05:56:14.323115       1 config.go:106] "Starting endpoint slice config controller"
	I1219 05:56:14.323160       1 shared_informer.go:349] "Waiting for caches to sync" controller="endpoint slice config"
	I1219 05:56:14.323243       1 config.go:403] "Starting serviceCIDR config controller"
	I1219 05:56:14.323280       1 shared_informer.go:349] "Waiting for caches to sync" controller="serviceCIDR config"
	I1219 05:56:14.324235       1 config.go:309] "Starting node config controller"
	I1219 05:56:14.324308       1 shared_informer.go:349] "Waiting for caches to sync" controller="node config"
	I1219 05:56:14.324405       1 shared_informer.go:356] "Caches are synced" controller="node config"
	I1219 05:56:14.424997       1 shared_informer.go:356] "Caches are synced" controller="serviceCIDR config"
	I1219 05:56:14.425146       1 shared_informer.go:356] "Caches are synced" controller="service config"
	I1219 05:56:14.425314       1 shared_informer.go:356] "Caches are synced" controller="endpoint slice config"
	
	
	==> kube-scheduler [80f5fc9caf3e4756bcd590fcb7b88ee219fc39b293345a00de18c3859326a6eb] <==
	E1219 05:56:03.641415       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1219 05:56:03.641661       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1219 05:56:03.641846       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: volumeattachments.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"volumeattachments\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1219 05:56:03.642007       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1219 05:56:03.642159       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"storageclasses\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StorageClass"
	E1219 05:56:03.642334       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	E1219 05:56:03.642489       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1219 05:56:03.642670       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1219 05:56:04.459604       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
	E1219 05:56:04.479265       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumeclaims\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolumeClaim"
	E1219 05:56:04.503174       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csistoragecapacities\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIStorageCapacity"
	E1219 05:56:04.529209       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicasets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicaSet"
	E1219 05:56:04.545665       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceSlice: resourceslices.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceslices\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceSlice"
	E1219 05:56:04.598777       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Pod: pods is forbidden: User \"system:kube-scheduler\" cannot list resource \"pods\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Pod"
	E1219 05:56:04.608720       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:kube-scheduler\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
	E1219 05:56:04.618361       1 reflector.go:205] "Failed to watch" err="failed to list *v1.DeviceClass: deviceclasses.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"deviceclasses\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.DeviceClass"
	E1219 05:56:04.665006       1 reflector.go:205] "Failed to watch" err="failed to list *v1.VolumeAttachment: volumeattachments.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"volumeattachments\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.VolumeAttachment"
	E1219 05:56:04.738829       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User \"system:kube-scheduler\" cannot list resource \"replicationcontrollers\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ReplicationController"
	E1219 05:56:04.772569       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"extension-apiserver-authentication\" is forbidden: User \"system:kube-scheduler\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\"" logger="UnhandledError" reflector="runtime/asm_arm64.s:1223" type="*v1.ConfigMap"
	E1219 05:56:04.806122       1 reflector.go:205] "Failed to watch" err="failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"csinodes\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSINode"
	E1219 05:56:04.817648       1 reflector.go:205] "Failed to watch" err="failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User \"system:kube-scheduler\" cannot list resource \"statefulsets\" in API group \"apps\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.StatefulSet"
	E1219 05:56:04.832812       1 reflector.go:205] "Failed to watch" err="failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User \"system:kube-scheduler\" cannot list resource \"persistentvolumes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.PersistentVolume"
	E1219 05:56:04.856331       1 reflector.go:205] "Failed to watch" err="failed to list *v1.ResourceClaim: resourceclaims.resource.k8s.io is forbidden: User \"system:kube-scheduler\" cannot list resource \"resourceclaims\" in API group \"resource.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.ResourceClaim"
	E1219 05:56:04.865034       1 reflector.go:205] "Failed to watch" err="failed to list *v1.Namespace: namespaces is forbidden: User \"system:kube-scheduler\" cannot list resource \"namespaces\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Namespace"
	I1219 05:56:06.702294       1 shared_informer.go:356] "Caches are synced" controller="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	
	
	==> kubelet <==
	Dec 19 05:57:06 functional-125117 kubelet[16669]: E1219 05:57:06.432669   16669 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"76c58cb980cd99548d8e38bf12a2d083cbce3ddb4d6369b981fa6103aae67cae\": not found" containerID="76c58cb980cd99548d8e38bf12a2d083cbce3ddb4d6369b981fa6103aae67cae"
	Dec 19 05:57:06 functional-125117 kubelet[16669]: I1219 05:57:06.432706   16669 kuberuntime_gc.go:364] "Error getting ContainerStatus for containerID" containerID="76c58cb980cd99548d8e38bf12a2d083cbce3ddb4d6369b981fa6103aae67cae" err="rpc error: code = NotFound desc = an error occurred when try to find container \"76c58cb980cd99548d8e38bf12a2d083cbce3ddb4d6369b981fa6103aae67cae\": not found"
	Dec 19 05:57:07 functional-125117 kubelet[16669]: I1219 05:57:07.373175   16669 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jpzk\" (UniqueName: \"kubernetes.io/projected/c81fd795-c18d-48e3-8218-8a686d492c5d-kube-api-access-2jpzk\") pod \"kubernetes-dashboard-metrics-scraper-7685fd8b77-nrkjq\" (UID: \"c81fd795-c18d-48e3-8218-8a686d492c5d\") " pod="kubernetes-dashboard/kubernetes-dashboard-metrics-scraper-7685fd8b77-nrkjq"
	Dec 19 05:57:07 functional-125117 kubelet[16669]: I1219 05:57:07.373238   16669 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/c81fd795-c18d-48e3-8218-8a686d492c5d-tmp-volume\") pod \"kubernetes-dashboard-metrics-scraper-7685fd8b77-nrkjq\" (UID: \"c81fd795-c18d-48e3-8218-8a686d492c5d\") " pod="kubernetes-dashboard/kubernetes-dashboard-metrics-scraper-7685fd8b77-nrkjq"
	Dec 19 05:57:07 functional-125117 kubelet[16669]: I1219 05:57:07.486025   16669 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kong-custom-dbless-config-volume\" (UniqueName: \"kubernetes.io/configmap/bd3b7e2e-bcbb-4978-902e-5558b3b23bfc-kong-custom-dbless-config-volume\") pod \"kubernetes-dashboard-kong-9849c64bd-527k2\" (UID: \"bd3b7e2e-bcbb-4978-902e-5558b3b23bfc\") " pod="kubernetes-dashboard/kubernetes-dashboard-kong-9849c64bd-527k2"
	Dec 19 05:57:07 functional-125117 kubelet[16669]: I1219 05:57:07.497512   16669 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5hpw\" (UniqueName: \"kubernetes.io/projected/40774d56-9b04-4276-ba8e-be42826d0b69-kube-api-access-g5hpw\") pod \"kubernetes-dashboard-api-5f6dd64f4-wz2l2\" (UID: \"40774d56-9b04-4276-ba8e-be42826d0b69\") " pod="kubernetes-dashboard/kubernetes-dashboard-api-5f6dd64f4-wz2l2"
	Dec 19 05:57:07 functional-125117 kubelet[16669]: I1219 05:57:07.504678   16669 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/d59727e7-0af4-4692-98bb-ec7f65c9f87a-tmp-volume\") pod \"kubernetes-dashboard-web-5c9f966b98-wllqv\" (UID: \"d59727e7-0af4-4692-98bb-ec7f65c9f87a\") " pod="kubernetes-dashboard/kubernetes-dashboard-web-5c9f966b98-wllqv"
	Dec 19 05:57:07 functional-125117 kubelet[16669]: I1219 05:57:07.504961   16669 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/40774d56-9b04-4276-ba8e-be42826d0b69-tmp-volume\") pod \"kubernetes-dashboard-api-5f6dd64f4-wz2l2\" (UID: \"40774d56-9b04-4276-ba8e-be42826d0b69\") " pod="kubernetes-dashboard/kubernetes-dashboard-api-5f6dd64f4-wz2l2"
	Dec 19 05:57:07 functional-125117 kubelet[16669]: I1219 05:57:07.505099   16669 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubernetes-dashboard-kong-prefix-dir\" (UniqueName: \"kubernetes.io/empty-dir/bd3b7e2e-bcbb-4978-902e-5558b3b23bfc-kubernetes-dashboard-kong-prefix-dir\") pod \"kubernetes-dashboard-kong-9849c64bd-527k2\" (UID: \"bd3b7e2e-bcbb-4978-902e-5558b3b23bfc\") " pod="kubernetes-dashboard/kubernetes-dashboard-kong-9849c64bd-527k2"
	Dec 19 05:57:07 functional-125117 kubelet[16669]: I1219 05:57:07.505233   16669 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubernetes-dashboard-kong-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd3b7e2e-bcbb-4978-902e-5558b3b23bfc-kubernetes-dashboard-kong-tmp\") pod \"kubernetes-dashboard-kong-9849c64bd-527k2\" (UID: \"bd3b7e2e-bcbb-4978-902e-5558b3b23bfc\") " pod="kubernetes-dashboard/kubernetes-dashboard-kong-9849c64bd-527k2"
	Dec 19 05:57:07 functional-125117 kubelet[16669]: I1219 05:57:07.505374   16669 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qrg5\" (UniqueName: \"kubernetes.io/projected/d59727e7-0af4-4692-98bb-ec7f65c9f87a-kube-api-access-5qrg5\") pod \"kubernetes-dashboard-web-5c9f966b98-wllqv\" (UID: \"d59727e7-0af4-4692-98bb-ec7f65c9f87a\") " pod="kubernetes-dashboard/kubernetes-dashboard-web-5c9f966b98-wllqv"
	Dec 19 05:57:07 functional-125117 kubelet[16669]: I1219 05:57:07.606591   16669 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/64090409-d0e5-4576-b8f0-e77c4b56c9f4-tmp-volume\") pod \"kubernetes-dashboard-auth-55fb9bbdf8-4b2hl\" (UID: \"64090409-d0e5-4576-b8f0-e77c4b56c9f4\") " pod="kubernetes-dashboard/kubernetes-dashboard-auth-55fb9bbdf8-4b2hl"
	Dec 19 05:57:07 functional-125117 kubelet[16669]: I1219 05:57:07.606707   16669 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vdc7\" (UniqueName: \"kubernetes.io/projected/64090409-d0e5-4576-b8f0-e77c4b56c9f4-kube-api-access-6vdc7\") pod \"kubernetes-dashboard-auth-55fb9bbdf8-4b2hl\" (UID: \"64090409-d0e5-4576-b8f0-e77c4b56c9f4\") " pod="kubernetes-dashboard/kubernetes-dashboard-auth-55fb9bbdf8-4b2hl"
	Dec 19 05:57:16 functional-125117 kubelet[16669]: I1219 05:57:16.821597   16669 kubelet_resources.go:64] "Allocatable" allocatable={"cpu":"2","ephemeral-storage":"203034800Ki","hugepages-1Gi":"0","hugepages-2Mi":"0","hugepages-32Mi":"0","hugepages-64Ki":"0","memory":"8022300Ki","pods":"110"}
	Dec 19 05:57:16 functional-125117 kubelet[16669]: I1219 05:57:16.821698   16669 kubelet_resources.go:64] "Allocatable" allocatable={"cpu":"2","ephemeral-storage":"203034800Ki","hugepages-1Gi":"0","hugepages-2Mi":"0","hugepages-32Mi":"0","hugepages-64Ki":"0","memory":"8022300Ki","pods":"110"}
	Dec 19 05:57:17 functional-125117 kubelet[16669]: I1219 05:57:17.749742   16669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kubernetes-dashboard/kubernetes-dashboard-kong-9849c64bd-527k2" podStartSLOduration=3.693364792 podStartE2EDuration="10.749720965s" podCreationTimestamp="2025-12-19 05:57:07 +0000 UTC" firstStartedPulling="2025-12-19 05:57:08.072011279 +0000 UTC m=+61.933041592" lastFinishedPulling="2025-12-19 05:57:15.128367452 +0000 UTC m=+68.989397765" observedRunningTime="2025-12-19 05:57:17.718313133 +0000 UTC m=+71.579343462" watchObservedRunningTime="2025-12-19 05:57:17.749720965 +0000 UTC m=+71.610751278"
	Dec 19 05:57:21 functional-125117 kubelet[16669]: I1219 05:57:21.122150   16669 kubelet_resources.go:64] "Allocatable" allocatable={"cpu":"2","ephemeral-storage":"203034800Ki","hugepages-1Gi":"0","hugepages-2Mi":"0","hugepages-32Mi":"0","hugepages-64Ki":"0","memory":"8022300Ki","pods":"110"}
	Dec 19 05:57:21 functional-125117 kubelet[16669]: I1219 05:57:21.122225   16669 kubelet_resources.go:64] "Allocatable" allocatable={"cpu":"2","ephemeral-storage":"203034800Ki","hugepages-1Gi":"0","hugepages-2Mi":"0","hugepages-32Mi":"0","hugepages-64Ki":"0","memory":"8022300Ki","pods":"110"}
	Dec 19 05:57:21 functional-125117 kubelet[16669]: I1219 05:57:21.761389   16669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kubernetes-dashboard/kubernetes-dashboard-api-5f6dd64f4-wz2l2" podStartSLOduration=6.060692172 podStartE2EDuration="14.761369975s" podCreationTimestamp="2025-12-19 05:57:07 +0000 UTC" firstStartedPulling="2025-12-19 05:57:08.120715714 +0000 UTC m=+61.981746026" lastFinishedPulling="2025-12-19 05:57:16.821393516 +0000 UTC m=+70.682423829" observedRunningTime="2025-12-19 05:57:17.750176626 +0000 UTC m=+71.611206955" watchObservedRunningTime="2025-12-19 05:57:21.761369975 +0000 UTC m=+75.622400288"
	Dec 19 05:57:22 functional-125117 kubelet[16669]: I1219 05:57:22.197856   16669 kubelet_resources.go:64] "Allocatable" allocatable={"cpu":"2","ephemeral-storage":"203034800Ki","hugepages-1Gi":"0","hugepages-2Mi":"0","hugepages-32Mi":"0","hugepages-64Ki":"0","memory":"8022300Ki","pods":"110"}
	Dec 19 05:57:22 functional-125117 kubelet[16669]: I1219 05:57:22.197955   16669 kubelet_resources.go:64] "Allocatable" allocatable={"cpu":"2","ephemeral-storage":"203034800Ki","hugepages-1Gi":"0","hugepages-2Mi":"0","hugepages-32Mi":"0","hugepages-64Ki":"0","memory":"8022300Ki","pods":"110"}
	Dec 19 05:57:22 functional-125117 kubelet[16669]: I1219 05:57:22.769765   16669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kubernetes-dashboard/kubernetes-dashboard-metrics-scraper-7685fd8b77-nrkjq" podStartSLOduration=1.866909063 podStartE2EDuration="15.769724862s" podCreationTimestamp="2025-12-19 05:57:07 +0000 UTC" firstStartedPulling="2025-12-19 05:57:08.294783906 +0000 UTC m=+62.155814218" lastFinishedPulling="2025-12-19 05:57:22.197599696 +0000 UTC m=+76.058630017" observedRunningTime="2025-12-19 05:57:22.769431074 +0000 UTC m=+76.630461420" watchObservedRunningTime="2025-12-19 05:57:22.769724862 +0000 UTC m=+76.630755232"
	Dec 19 05:57:22 functional-125117 kubelet[16669]: I1219 05:57:22.770476   16669 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kubernetes-dashboard/kubernetes-dashboard-web-5c9f966b98-wllqv" podStartSLOduration=2.797449284 podStartE2EDuration="15.770463241s" podCreationTimestamp="2025-12-19 05:57:07 +0000 UTC" firstStartedPulling="2025-12-19 05:57:08.148574323 +0000 UTC m=+62.009604636" lastFinishedPulling="2025-12-19 05:57:21.12158828 +0000 UTC m=+74.982618593" observedRunningTime="2025-12-19 05:57:21.761037902 +0000 UTC m=+75.622068223" watchObservedRunningTime="2025-12-19 05:57:22.770463241 +0000 UTC m=+76.631493562"
	Dec 19 05:57:23 functional-125117 kubelet[16669]: I1219 05:57:23.417034   16669 kubelet_resources.go:64] "Allocatable" allocatable={"cpu":"2","ephemeral-storage":"203034800Ki","hugepages-1Gi":"0","hugepages-2Mi":"0","hugepages-32Mi":"0","hugepages-64Ki":"0","memory":"8022300Ki","pods":"110"}
	Dec 19 05:57:23 functional-125117 kubelet[16669]: I1219 05:57:23.417620   16669 kubelet_resources.go:64] "Allocatable" allocatable={"cpu":"2","ephemeral-storage":"203034800Ki","hugepages-1Gi":"0","hugepages-2Mi":"0","hugepages-32Mi":"0","hugepages-64Ki":"0","memory":"8022300Ki","pods":"110"}
	
	
	==> kubernetes-dashboard [29f9704c03ef922b8caa0b1f4f2a3bfcd7eb0990250215e8c3580c91d26110d8] <==
	I1219 05:57:22.360726       1 main.go:43] "Starting Metrics Scraper" version="1.2.2"
	W1219 05:57:22.360841       1 client_config.go:667] Neither --kubeconfig nor --master was specified.  Using the inClusterConfig.  This might not work.
	I1219 05:57:22.360954       1 main.go:51] Kubernetes host: https://10.96.0.1:443
	I1219 05:57:22.360960       1 main.go:52] Namespace(s): []
	
	
	==> kubernetes-dashboard [89d78fe89c73a68b9128e7228e35a5b9296f313994f16ba47daf809095b14825] <==
	I1219 05:57:17.031602       1 main.go:40] "Starting Kubernetes Dashboard API" version="1.14.0"
	I1219 05:57:17.031786       1 init.go:49] Using in-cluster config
	I1219 05:57:17.032037       1 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1219 05:57:17.032049       1 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1219 05:57:17.032054       1 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1219 05:57:17.032058       1 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1219 05:57:17.045001       1 main.go:119] "Successful initial request to the apiserver" version="v1.34.3"
	I1219 05:57:17.045036       1 client.go:265] Creating in-cluster Sidecar client
	I1219 05:57:17.116408       1 main.go:96] "Listening and serving on" address="0.0.0.0:8000"
	E1219 05:57:17.119466       1 manager.go:96] Metric client health check failed: the server is currently unable to handle the request (get services kubernetes-dashboard-metrics-scraper). Retrying in 30 seconds.
	
	
	==> kubernetes-dashboard [985ee70cda830123987b6d97ba524b9cf90dab1dd81fff6785c740071638f8a4] <==
	I1219 05:57:21.342523       1 main.go:37] "Starting Kubernetes Dashboard Web" version="1.7.0"
	I1219 05:57:21.342593       1 init.go:48] Using in-cluster config
	I1219 05:57:21.342964       1 main.go:57] "Listening and serving insecurely on" address="0.0.0.0:8000"
	
	
	==> kubernetes-dashboard [d56bac3d8d52fba6a6ff205aef02077569096618ca3b77f900e4c2ac7d44d3b9] <==
	I1219 05:57:23.615872       1 main.go:34] "Starting Kubernetes Dashboard Auth" version="1.4.0"
	I1219 05:57:23.615974       1 init.go:49] Using in-cluster config
	I1219 05:57:23.616097       1 main.go:44] "Listening and serving insecurely on" address="0.0.0.0:8000"
	
	
	==> storage-provisioner [43e1c32c12ad7b3f9b354b1eb10a2323efadcd73ab44c88d1b5ee33160e6dd2e] <==
	W1219 05:56:59.670173       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:01.673418       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:01.678181       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:03.681262       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:03.686122       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:05.689882       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:05.698701       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:07.722145       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:07.738520       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:09.741659       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:09.746965       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:11.752062       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:11.758584       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:13.762547       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:13.774586       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:15.782555       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:15.789346       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:17.793431       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:17.805404       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:19.808477       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:19.821654       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:21.824490       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:21.829289       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:23.832493       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	W1219 05:57:23.837613       1 warnings.go:70] v1 Endpoints is deprecated in v1.33+; use discovery.k8s.io/v1 EndpointSlice
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-125117 -n functional-125117
helpers_test.go:270: (dbg) Run:  kubectl --context functional-125117 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:281: non-running pods: busybox-mount
helpers_test.go:283: ======> post-mortem[TestFunctional/parallel/DashboardCmd]: describe non-running pods <======
helpers_test.go:286: (dbg) Run:  kubectl --context functional-125117 describe pod busybox-mount
helpers_test.go:291: (dbg) kubectl --context functional-125117 describe pod busybox-mount:

-- stdout --
	Name:             busybox-mount
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             functional-125117/192.168.49.2
	Start Time:       Fri, 19 Dec 2025 05:56:52 +0000
	Labels:           integration-test=busybox-mount
	Annotations:      <none>
	Status:           Succeeded
	IP:               10.244.0.11
	IPs:
	  IP:  10.244.0.11
	Containers:
	  mount-munger:
	    Container ID:  containerd://b1ee8306f19fd7b881ca18f2bd82b6bf7345fee7183bdceaac9a98559b6767f9
	    Image:         gcr.io/k8s-minikube/busybox:1.28.4-glibc
	    Image ID:      gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e
	    Port:          <none>
	    Host Port:     <none>
	    Command:
	      /bin/sh
	      -c
	      --
	    Args:
	      cat /mount-9p/created-by-test; echo test > /mount-9p/created-by-pod; rm /mount-9p/created-by-test-removed-by-pod; echo test > /mount-9p/created-by-pod-removed-by-test date >> /mount-9p/pod-dates
	    State:          Terminated
	      Reason:       Completed
	      Exit Code:    0
	      Started:      Fri, 19 Dec 2025 05:56:55 +0000
	      Finished:     Fri, 19 Dec 2025 05:56:55 +0000
	    Ready:          False
	    Restart Count:  0
	    Environment:    <none>
	    Mounts:
	      /mount-9p from test-volume (rw)
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-rn4dz (ro)
	Conditions:
	  Type                        Status
	  PodReadyToStartContainers   False 
	  Initialized                 True 
	  Ready                       False 
	  ContainersReady             False 
	  PodScheduled                True 
	Volumes:
	  test-volume:
	    Type:          HostPath (bare host directory volume)
	    Path:          /mount-9p
	    HostPathType:  
	  kube-api-access-rn4dz:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    Optional:                false
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type    Reason     Age   From               Message
	  ----    ------     ----  ----               -------
	  Normal  Scheduled  33s   default-scheduler  Successfully assigned default/busybox-mount to functional-125117
	  Normal  Pulling    33s   kubelet            Pulling image "gcr.io/k8s-minikube/busybox:1.28.4-glibc"
	  Normal  Pulled     31s   kubelet            Successfully pulled image "gcr.io/k8s-minikube/busybox:1.28.4-glibc" in 2.215s (2.215s including waiting). Image size: 1935750 bytes.
	  Normal  Created    31s   kubelet            Created container: mount-munger
	  Normal  Started    30s   kubelet            Started container mount-munger

-- /stdout --
helpers_test.go:294: <<< TestFunctional/parallel/DashboardCmd FAILED: end of post-mortem logs <<<
helpers_test.go:295: ---------------------/post-mortem---------------------------------
--- FAIL: TestFunctional/parallel/DashboardCmd (23.96s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy (502.35s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy
functional_test.go:2239: (dbg) Run:  out/minikube-linux-arm64 start -p functional-006924 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1
E1219 06:00:27.490830 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:01:29.405870 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:01:29.411163 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:01:29.421894 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:01:29.442201 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:01:29.482576 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:01:29.562929 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:01:29.723386 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:01:30.044065 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:01:30.685105 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:01:31.966028 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:01:34.527833 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:01:39.648516 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:01:49.889659 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:02:10.370013 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:02:51.330322 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:04:13.250636 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:05:27.486475 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:2239: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-006924 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1: exit status 109 (8m20.88567s)

-- stdout --
	* [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22230
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on user configuration
	* Using Docker driver with root privileges
	* Starting "functional-006924" primary control-plane node in "functional-006924" cluster
	* Pulling base image v0.0.48-1765966054-22186 ...
	* Found network options:
	  - HTTP_PROXY=localhost:39775
	* Please see https://minikube.sigs.k8s.io/docs/handbook/vpn_and_proxy/ for more details
	* Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	
	

-- /stdout --
** stderr ** 
	! Local proxy ignored: not passing HTTP_PROXY=localhost:39775 to docker env.
	! You appear to be using a proxy, but your NO_PROXY environment does not include the minikube IP (192.168.49.2).
	! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Generating "apiserver-kubelet-client" certificate and key
	[certs] Generating "front-proxy-ca" certificate and key
	[certs] Generating "front-proxy-client" certificate and key
	[certs] Generating "etcd/ca" certificate and key
	[certs] Generating "etcd/server" certificate and key
	[certs] etcd/server serving cert is signed for DNS names [functional-006924 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/peer" certificate and key
	[certs] etcd/peer serving cert is signed for DNS names [functional-006924 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/healthcheck-client" certificate and key
	[certs] Generating "apiserver-etcd-client" certificate and key
	[certs] Generating "sa" key and public key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001039228s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000326287s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000326287s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	* Related issue: https://github.com/kubernetes/minikube/issues/4172

** /stderr **
functional_test.go:2241: failed minikube start. args "out/minikube-linux-arm64 start -p functional-006924 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1": exit status 109
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-006924
helpers_test.go:244: (dbg) docker inspect functional-006924:

-- stdout --
	[
	    {
	        "Id": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	        "Created": "2025-12-19T05:57:32.987616309Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2053574,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T05:57:33.050252475Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hostname",
	        "HostsPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hosts",
	        "LogPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6-json.log",
	        "Name": "/functional-006924",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-006924:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-006924",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	                "LowerDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/merged",
	                "UpperDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/diff",
	                "WorkDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-006924",
	                "Source": "/var/lib/docker/volumes/functional-006924/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-006924",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-006924",
	                "name.minikube.sigs.k8s.io": "functional-006924",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "c06ab2bd44169716d410789ed39ed6e7c04e20cbf7fddb96691439282b9c97ca",
	            "SandboxKey": "/var/run/docker/netns/c06ab2bd4416",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34704"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34705"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34708"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34706"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34707"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-006924": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "42:2f:87:6a:a8:7b",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "f63e8dc2cff83663f8a4d14108f192e61e457410fa4fc720cd9630dbf354815d",
	                    "EndpointID": "aa2b1cbd90d5c1f6130481423d97f82d974d4197e41ad0dbe3b7e51b22c8b4cc",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-006924",
	                        "651d0d6ef1db"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
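The `NetworkSettings.Ports` section of the `docker inspect` output above maps each container port to a host binding. As a minimal sketch of how those bindings can be read programmatically (the JSON fragment below is abridged from the report, and the helper name `host_ports` is hypothetical):

```python
import json

# Abridged NetworkSettings.Ports fragment from the `docker inspect` output above.
inspect_json = """
[{"NetworkSettings": {"Ports": {
    "22/tcp":   [{"HostIp": "127.0.0.1", "HostPort": "34704"}],
    "8441/tcp": [{"HostIp": "127.0.0.1", "HostPort": "34707"}]
}}}]
"""

def host_ports(raw):
    """Map container port -> (HostIp, HostPort) of the first binding."""
    container = json.loads(raw)[0]
    ports = container["NetworkSettings"]["Ports"]
    return {p: (b[0]["HostIp"], b[0]["HostPort"]) for p, b in ports.items() if b}

print(host_ports(inspect_json))
# {'22/tcp': ('127.0.0.1', '34704'), '8441/tcp': ('127.0.0.1', '34707')}
```

Port 8441/tcp is the `--apiserver-port` the failing start command requested, published on 127.0.0.1:34707.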
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924: exit status 6 (319.326614ms)

-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

-- /stdout --
** stderr ** 
	E1219 06:05:49.357352 2058758 status.go:458] kubeconfig endpoint: get endpoint: "functional-006924" does not appear in /home/jenkins/minikube-integration/22230-1998525/kubeconfig

** /stderr **
helpers_test.go:248: status error: exit status 6 (may be ok)
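The `status.go:458` error above fires because the profile name is absent from the kubeconfig's cluster list. A minimal sketch of that lookup, under the assumption that it is a simple name match over `clusters` (the kubeconfig content below is a hypothetical stand-in, not the file from the CI host):

```python
# Hypothetical kubeconfig stand-in: only a stale "minikube-vm" entry exists,
# mirroring the "does not appear in ... kubeconfig" error in the log.
kubeconfig = {
    "clusters": [
        {"name": "minikube-vm", "cluster": {"server": "https://192.168.49.2:8441"}},
    ],
}

def endpoint_for(profile, cfg):
    """Return the API server URL for `profile`, or None if it is absent."""
    for entry in cfg.get("clusters", []):
        if entry["name"] == profile:
            return entry["cluster"]["server"]
    return None

print(endpoint_for("functional-006924", kubeconfig))  # -> None, hence exit status 6
```

The log's own suggested fix, `minikube update-context`, rewrites the kubeconfig so the profile entry exists again.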
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-125117 image save kicbase/echo-server:functional-125117 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image rm kicbase/echo-server:functional-125117 --alsologtostderr                                                                              │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                                      │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                                      │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image save --daemon kicbase/echo-server:functional-125117 --alsologtostderr                                                                   │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/2000386.pem                                                                                                       │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /usr/share/ca-certificates/2000386.pem                                                                                           │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                                        │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/20003862.pem                                                                                                      │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /usr/share/ca-certificates/20003862.pem                                                                                          │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/test/nested/copy/2000386/hosts                                                                                              │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format short --alsologtostderr                                                                                                     │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format yaml --alsologtostderr                                                                                                      │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh pgrep buildkitd                                                                                                                           │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │                     │
	│ image          │ functional-125117 image build -t localhost/my-image:functional-125117 testdata/build --alsologtostderr                                                          │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                                      │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format json --alsologtostderr                                                                                                      │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format table --alsologtostderr                                                                                                     │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                                         │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                                         │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                                         │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ delete         │ -p functional-125117                                                                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ start          │ -p functional-006924 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │                     │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 05:57:28
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 05:57:28.181232 2053190 out.go:360] Setting OutFile to fd 1 ...
	I1219 05:57:28.181334 2053190 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:57:28.181338 2053190 out.go:374] Setting ErrFile to fd 2...
	I1219 05:57:28.181342 2053190 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:57:28.182343 2053190 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 05:57:28.182828 2053190 out.go:368] Setting JSON to false
	I1219 05:57:28.183678 2053190 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":38395,"bootTime":1766085454,"procs":152,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 05:57:28.183736 2053190 start.go:143] virtualization:  
	I1219 05:57:28.188531 2053190 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 05:57:28.192078 2053190 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 05:57:28.192158 2053190 notify.go:221] Checking for updates...
	I1219 05:57:28.198989 2053190 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 05:57:28.202302 2053190 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 05:57:28.205562 2053190 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 05:57:28.208741 2053190 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 05:57:28.211872 2053190 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 05:57:28.215192 2053190 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 05:57:28.245074 2053190 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 05:57:28.245195 2053190 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 05:57:28.311471 2053190 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:42 SystemTime:2025-12-19 05:57:28.302497694 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 05:57:28.311559 2053190 docker.go:319] overlay module found
	I1219 05:57:28.314759 2053190 out.go:179] * Using the docker driver based on user configuration
	I1219 05:57:28.317821 2053190 start.go:309] selected driver: docker
	I1219 05:57:28.317829 2053190 start.go:928] validating driver "docker" against <nil>
	I1219 05:57:28.317840 2053190 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 05:57:28.318574 2053190 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 05:57:28.373914 2053190 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:42 SystemTime:2025-12-19 05:57:28.365095592 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 05:57:28.374052 2053190 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1219 05:57:28.374256 2053190 start_flags.go:993] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1219 05:57:28.377210 2053190 out.go:179] * Using Docker driver with root privileges
	I1219 05:57:28.380032 2053190 cni.go:84] Creating CNI manager for ""
	I1219 05:57:28.380085 2053190 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 05:57:28.380095 2053190 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1219 05:57:28.380159 2053190 start.go:353] cluster config:
	{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 05:57:28.383325 2053190 out.go:179] * Starting "functional-006924" primary control-plane node in "functional-006924" cluster
	I1219 05:57:28.386141 2053190 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1219 05:57:28.389059 2053190 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1219 05:57:28.391840 2053190 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 05:57:28.391878 2053190 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1219 05:57:28.391887 2053190 cache.go:65] Caching tarball of preloaded images
	I1219 05:57:28.391919 2053190 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1219 05:57:28.391983 2053190 preload.go:238] Found /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1219 05:57:28.391992 2053190 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1219 05:57:28.392329 2053190 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/config.json ...
	I1219 05:57:28.392348 2053190 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/config.json: {Name:mk3e91540958343da8859e571edadad15904c756 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 05:57:28.410985 2053190 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1219 05:57:28.411001 2053190 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1219 05:57:28.411031 2053190 cache.go:243] Successfully downloaded all kic artifacts
	I1219 05:57:28.411059 2053190 start.go:360] acquireMachinesLock for functional-006924: {Name:mkc84f48e83d18024791d45db780f3ccd746613a Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1219 05:57:28.411173 2053190 start.go:364] duration metric: took 100.145µs to acquireMachinesLock for "functional-006924"
	I1219 05:57:28.411204 2053190 start.go:93] Provisioning new machine with config: &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1219 05:57:28.411280 2053190 start.go:125] createHost starting for "" (driver="docker")
	I1219 05:57:28.414609 2053190 out.go:252] * Creating docker container (CPUs=2, Memory=4096MB) ...
	W1219 05:57:28.414881 2053190 out.go:285] ! Local proxy ignored: not passing HTTP_PROXY=localhost:39775 to docker env.
	I1219 05:57:28.414905 2053190 start.go:159] libmachine.API.Create for "functional-006924" (driver="docker")
	I1219 05:57:28.414932 2053190 client.go:173] LocalClient.Create starting
	I1219 05:57:28.414991 2053190 main.go:144] libmachine: Reading certificate data from /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem
	I1219 05:57:28.415021 2053190 main.go:144] libmachine: Decoding PEM data...
	I1219 05:57:28.415038 2053190 main.go:144] libmachine: Parsing certificate...
	I1219 05:57:28.415086 2053190 main.go:144] libmachine: Reading certificate data from /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem
	I1219 05:57:28.415104 2053190 main.go:144] libmachine: Decoding PEM data...
	I1219 05:57:28.415114 2053190 main.go:144] libmachine: Parsing certificate...
	I1219 05:57:28.415476 2053190 cli_runner.go:164] Run: docker network inspect functional-006924 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1219 05:57:28.430914 2053190 cli_runner.go:211] docker network inspect functional-006924 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1219 05:57:28.431004 2053190 network_create.go:284] running [docker network inspect functional-006924] to gather additional debugging logs...
	I1219 05:57:28.431019 2053190 cli_runner.go:164] Run: docker network inspect functional-006924
	W1219 05:57:28.446263 2053190 cli_runner.go:211] docker network inspect functional-006924 returned with exit code 1
	I1219 05:57:28.446283 2053190 network_create.go:287] error running [docker network inspect functional-006924]: docker network inspect functional-006924: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network functional-006924 not found
	I1219 05:57:28.446310 2053190 network_create.go:289] output of [docker network inspect functional-006924]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network functional-006924 not found
	
	** /stderr **
	I1219 05:57:28.446408 2053190 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 05:57:28.465124 2053190 network.go:206] using free private subnet 192.168.49.0/24: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x4001992010}
	I1219 05:57:28.465154 2053190 network_create.go:124] attempt to create docker network functional-006924 192.168.49.0/24 with gateway 192.168.49.1 and MTU of 1500 ...
	I1219 05:57:28.465211 2053190 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.49.0/24 --gateway=192.168.49.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=functional-006924 functional-006924
	I1219 05:57:28.524721 2053190 network_create.go:108] docker network functional-006924 192.168.49.0/24 created
	I1219 05:57:28.524747 2053190 kic.go:121] calculated static IP "192.168.49.2" for the "functional-006924" container
	I1219 05:57:28.524885 2053190 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1219 05:57:28.539703 2053190 cli_runner.go:164] Run: docker volume create functional-006924 --label name.minikube.sigs.k8s.io=functional-006924 --label created_by.minikube.sigs.k8s.io=true
	I1219 05:57:28.558237 2053190 oci.go:103] Successfully created a docker volume functional-006924
	I1219 05:57:28.558332 2053190 cli_runner.go:164] Run: docker run --rm --name functional-006924-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=functional-006924 --entrypoint /usr/bin/test -v functional-006924:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -d /var/lib
	I1219 05:57:29.070110 2053190 oci.go:107] Successfully prepared a docker volume functional-006924
	I1219 05:57:29.070177 2053190 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 05:57:29.070185 2053190 kic.go:194] Starting extracting preloaded images to volume ...
	I1219 05:57:29.070258 2053190 cli_runner.go:164] Run: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v functional-006924:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -I lz4 -xf /preloaded.tar -C /extractDir
	I1219 05:57:32.921851 2053190 cli_runner.go:217] Completed: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v functional-006924:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -I lz4 -xf /preloaded.tar -C /extractDir: (3.851551995s)
	I1219 05:57:32.921880 2053190 kic.go:203] duration metric: took 3.851691639s to extract preloaded images to volume ...
	W1219 05:57:32.922003 2053190 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1219 05:57:32.922108 2053190 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1219 05:57:32.973224 2053190 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname functional-006924 --name functional-006924 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=functional-006924 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=functional-006924 --network functional-006924 --ip 192.168.49.2 --volume functional-006924:/var --security-opt apparmor=unconfined --memory=4096mb --cpus=2 -e container=docker --expose 8441 --publish=127.0.0.1::8441 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0
	I1219 05:57:33.289746 2053190 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Running}}
	I1219 05:57:33.311237 2053190 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 05:57:33.334015 2053190 cli_runner.go:164] Run: docker exec functional-006924 stat /var/lib/dpkg/alternatives/iptables
	I1219 05:57:33.383388 2053190 oci.go:144] the created container "functional-006924" has a running status.
	I1219 05:57:33.383415 2053190 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa...
	I1219 05:57:33.631027 2053190 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1219 05:57:33.657051 2053190 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 05:57:33.679295 2053190 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1219 05:57:33.679306 2053190 kic_runner.go:114] Args: [docker exec --privileged functional-006924 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1219 05:57:33.743179 2053190 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 05:57:33.777752 2053190 machine.go:94] provisionDockerMachine start ...
	I1219 05:57:33.777836 2053190 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 05:57:33.805420 2053190 main.go:144] libmachine: Using SSH client type: native
	I1219 05:57:33.805758 2053190 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 05:57:33.805765 2053190 main.go:144] libmachine: About to run SSH command:
	hostname
	I1219 05:57:33.806312 2053190 main.go:144] libmachine: Error dialing TCP: ssh: handshake failed: read tcp 127.0.0.1:40736->127.0.0.1:34704: read: connection reset by peer
	I1219 05:57:36.964670 2053190 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 05:57:36.964685 2053190 ubuntu.go:182] provisioning hostname "functional-006924"
	I1219 05:57:36.964784 2053190 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 05:57:36.982235 2053190 main.go:144] libmachine: Using SSH client type: native
	I1219 05:57:36.982546 2053190 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 05:57:36.982555 2053190 main.go:144] libmachine: About to run SSH command:
	sudo hostname functional-006924 && echo "functional-006924" | sudo tee /etc/hostname
	I1219 05:57:37.150675 2053190 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 05:57:37.150757 2053190 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 05:57:37.168539 2053190 main.go:144] libmachine: Using SSH client type: native
	I1219 05:57:37.168893 2053190 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 05:57:37.168907 2053190 main.go:144] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-006924' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-006924/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-006924' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1219 05:57:37.321072 2053190 main.go:144] libmachine: SSH cmd err, output: <nil>: 
	I1219 05:57:37.321087 2053190 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22230-1998525/.minikube CaCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22230-1998525/.minikube}
	I1219 05:57:37.321106 2053190 ubuntu.go:190] setting up certificates
	I1219 05:57:37.321115 2053190 provision.go:84] configureAuth start
	I1219 05:57:37.321184 2053190 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 05:57:37.338803 2053190 provision.go:143] copyHostCerts
	I1219 05:57:37.338868 2053190 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem, removing ...
	I1219 05:57:37.338876 2053190 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 05:57:37.338956 2053190 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem (1078 bytes)
	I1219 05:57:37.339041 2053190 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem, removing ...
	I1219 05:57:37.339050 2053190 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 05:57:37.339076 2053190 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem (1123 bytes)
	I1219 05:57:37.339127 2053190 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem, removing ...
	I1219 05:57:37.339131 2053190 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 05:57:37.339152 2053190 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem (1671 bytes)
	I1219 05:57:37.339195 2053190 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem org=jenkins.functional-006924 san=[127.0.0.1 192.168.49.2 functional-006924 localhost minikube]
	I1219 05:57:38.009465 2053190 provision.go:177] copyRemoteCerts
	I1219 05:57:38.009543 2053190 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1219 05:57:38.009587 2053190 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 05:57:38.028609 2053190 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 05:57:38.139367 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1219 05:57:38.156968 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1219 05:57:38.175435 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1219 05:57:38.193298 2053190 provision.go:87] duration metric: took 872.159123ms to configureAuth
	I1219 05:57:38.193316 2053190 ubuntu.go:206] setting minikube options for container-runtime
	I1219 05:57:38.193542 2053190 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 05:57:38.193548 2053190 machine.go:97] duration metric: took 4.415787388s to provisionDockerMachine
	I1219 05:57:38.193555 2053190 client.go:176] duration metric: took 9.778618727s to LocalClient.Create
	I1219 05:57:38.193577 2053190 start.go:167] duration metric: took 9.77867207s to libmachine.API.Create "functional-006924"
	I1219 05:57:38.193582 2053190 start.go:293] postStartSetup for "functional-006924" (driver="docker")
	I1219 05:57:38.193592 2053190 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1219 05:57:38.193649 2053190 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1219 05:57:38.193687 2053190 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 05:57:38.210841 2053190 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 05:57:38.317258 2053190 ssh_runner.go:195] Run: cat /etc/os-release
	I1219 05:57:38.320668 2053190 main.go:144] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1219 05:57:38.320685 2053190 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1219 05:57:38.320697 2053190 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/addons for local assets ...
	I1219 05:57:38.320751 2053190 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/files for local assets ...
	I1219 05:57:38.320876 2053190 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> 20003862.pem in /etc/ssl/certs
	I1219 05:57:38.320955 2053190 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> hosts in /etc/test/nested/copy/2000386
	I1219 05:57:38.321006 2053190 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/2000386
	I1219 05:57:38.328957 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 05:57:38.347434 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts --> /etc/test/nested/copy/2000386/hosts (40 bytes)
	I1219 05:57:38.366026 2053190 start.go:296] duration metric: took 172.429001ms for postStartSetup
	I1219 05:57:38.366395 2053190 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 05:57:38.384973 2053190 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/config.json ...
	I1219 05:57:38.385248 2053190 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 05:57:38.385294 2053190 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 05:57:38.402917 2053190 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 05:57:38.505885 2053190 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1219 05:57:38.510660 2053190 start.go:128] duration metric: took 10.099366615s to createHost
	I1219 05:57:38.510676 2053190 start.go:83] releasing machines lock for "functional-006924", held for 10.099494961s
	I1219 05:57:38.510757 2053190 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 05:57:38.533325 2053190 out.go:179] * Found network options:
	I1219 05:57:38.536247 2053190 out.go:179]   - HTTP_PROXY=localhost:39775
	W1219 05:57:38.539135 2053190 out.go:285] ! You appear to be using a proxy, but your NO_PROXY environment does not include the minikube IP (192.168.49.2).
	I1219 05:57:38.542005 2053190 out.go:179] * Please see https://minikube.sigs.k8s.io/docs/handbook/vpn_and_proxy/ for more details
	I1219 05:57:38.544961 2053190 ssh_runner.go:195] Run: cat /version.json
	I1219 05:57:38.545002 2053190 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 05:57:38.545025 2053190 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1219 05:57:38.545076 2053190 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 05:57:38.568254 2053190 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 05:57:38.569443 2053190 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 05:57:38.762300 2053190 ssh_runner.go:195] Run: systemctl --version
	I1219 05:57:38.768615 2053190 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1219 05:57:38.772855 2053190 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1219 05:57:38.772916 2053190 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1219 05:57:38.799398 2053190 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1219 05:57:38.799410 2053190 start.go:496] detecting cgroup driver to use...
	I1219 05:57:38.799440 2053190 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1219 05:57:38.799489 2053190 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1219 05:57:38.814469 2053190 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1219 05:57:38.827580 2053190 docker.go:218] disabling cri-docker service (if available) ...
	I1219 05:57:38.827634 2053190 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1219 05:57:38.845067 2053190 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1219 05:57:38.864167 2053190 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1219 05:57:38.982132 2053190 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1219 05:57:39.115031 2053190 docker.go:234] disabling docker service ...
	I1219 05:57:39.115108 2053190 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1219 05:57:39.137000 2053190 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1219 05:57:39.149992 2053190 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1219 05:57:39.271719 2053190 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1219 05:57:39.396126 2053190 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1219 05:57:39.409558 2053190 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1219 05:57:39.424602 2053190 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1219 05:57:39.433967 2053190 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1219 05:57:39.443204 2053190 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1219 05:57:39.443265 2053190 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1219 05:57:39.452640 2053190 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 05:57:39.462230 2053190 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1219 05:57:39.470845 2053190 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 05:57:39.479487 2053190 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1219 05:57:39.487623 2053190 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1219 05:57:39.496141 2053190 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1219 05:57:39.504672 2053190 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1219 05:57:39.514186 2053190 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1219 05:57:39.521908 2053190 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1219 05:57:39.529500 2053190 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 05:57:39.638647 2053190 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1219 05:57:39.771673 2053190 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1219 05:57:39.771738 2053190 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1219 05:57:39.775554 2053190 start.go:564] Will wait 60s for crictl version
	I1219 05:57:39.775607 2053190 ssh_runner.go:195] Run: which crictl
	I1219 05:57:39.779022 2053190 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1219 05:57:39.803248 2053190 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1219 05:57:39.803308 2053190 ssh_runner.go:195] Run: containerd --version
	I1219 05:57:39.824649 2053190 ssh_runner.go:195] Run: containerd --version
	I1219 05:57:39.849617 2053190 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1219 05:57:39.852525 2053190 cli_runner.go:164] Run: docker network inspect functional-006924 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 05:57:39.868466 2053190 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1219 05:57:39.872427 2053190 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.49.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1219 05:57:39.882237 2053190 kubeadm.go:884] updating cluster {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1219 05:57:39.882356 2053190 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 05:57:39.882417 2053190 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 05:57:39.907763 2053190 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 05:57:39.907774 2053190 containerd.go:534] Images already preloaded, skipping extraction
	I1219 05:57:39.907833 2053190 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 05:57:39.931815 2053190 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 05:57:39.931827 2053190 cache_images.go:86] Images are preloaded, skipping loading
	I1219 05:57:39.931833 2053190 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1219 05:57:39.931930 2053190 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-006924 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1219 05:57:39.931999 2053190 ssh_runner.go:195] Run: sudo crictl --timeout=10s info
	I1219 05:57:39.956796 2053190 cni.go:84] Creating CNI manager for ""
	I1219 05:57:39.956804 2053190 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 05:57:39.956819 2053190 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1219 05:57:39.956841 2053190 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-006924 NodeName:functional-006924 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1219 05:57:39.956951 2053190 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-006924"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
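The kubeadm config rendered above is a single four-document YAML stream (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration). A minimal sketch of a sanity check on such a stream, counting the `kind:` declarations; the temp file here stands in for the real /var/tmp/minikube/kubeadm.yaml.new:

```shell
# Sanity-check a rendered kubeadm config stream: each YAML document must
# declare a kind, and this layout is expected to contain exactly four.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
EOF
kinds=$(grep -c '^kind:' "$cfg")
echo "documents: $kinds"
rm -f "$cfg"
```

On a live node the same check could be pointed at the file minikube actually writes, but the path and document set shown here are taken from this log, not guaranteed for other configurations.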
	I1219 05:57:39.957024 2053190 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1219 05:57:39.964905 2053190 binaries.go:51] Found k8s binaries, skipping transfer
	I1219 05:57:39.964972 2053190 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1219 05:57:39.972887 2053190 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1219 05:57:39.985963 2053190 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1219 05:57:39.999267 2053190 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2235 bytes)
	I1219 05:57:40.017744 2053190 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1219 05:57:40.022879 2053190 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.49.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
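The /etc/hosts command above uses a strip-then-append idiom so it stays idempotent: any previous entry for the name is filtered out before the fresh line is appended. A sketch of the same idiom against a scratch file (not the real /etc/hosts):

```shell
# Idempotent hosts-entry update: remove any existing line for the name,
# then append the current mapping, so repeated runs never duplicate it.
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.168.49.2\tcontrol-plane.minikube.internal\n' > "$hosts"
update_entry() {
  ip=$1; name=$2
  { grep -v "[[:space:]]$name\$" "$hosts"; printf '%s\t%s\n' "$ip" "$name"; } > "$hosts.new"
  mv "$hosts.new" "$hosts"
}
update_entry 192.168.49.2 control-plane.minikube.internal
update_entry 192.168.49.2 control-plane.minikube.internal   # second run is a no-op
count=$(grep -c 'control-plane\.minikube\.internal$' "$hosts")
echo "entries: $count"   # exactly one entry despite two updates
rm -f "$hosts"
```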
	I1219 05:57:40.034959 2053190 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 05:57:40.150027 2053190 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 05:57:40.165945 2053190 certs.go:69] Setting up /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924 for IP: 192.168.49.2
	I1219 05:57:40.165956 2053190 certs.go:195] generating shared ca certs ...
	I1219 05:57:40.165971 2053190 certs.go:227] acquiring lock for ca certs: {Name:mk382c71693ea4061363f97b153b21bf6cdf5f38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 05:57:40.166119 2053190 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key
	I1219 05:57:40.166160 2053190 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key
	I1219 05:57:40.166165 2053190 certs.go:257] generating profile certs ...
	I1219 05:57:40.166222 2053190 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key
	I1219 05:57:40.166231 2053190 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt with IP's: []
	I1219 05:57:40.648334 2053190 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt ...
	I1219 05:57:40.648351 2053190 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: {Name:mka5389e1cc883d30ee9be689670e03d0f7966ba Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 05:57:40.648554 2053190 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key ...
	I1219 05:57:40.648560 2053190 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key: {Name:mk6e5d4d217f4a985c359bfa02e00c95dc52f64e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 05:57:40.648654 2053190 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key.febe6fed
	I1219 05:57:40.648665 2053190 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt.febe6fed with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.49.2]
	I1219 05:57:40.918619 2053190 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt.febe6fed ...
	I1219 05:57:40.918643 2053190 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt.febe6fed: {Name:mkc071bb89a2fe22ecf9939a6709163920f68aa7 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 05:57:40.918852 2053190 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key.febe6fed ...
	I1219 05:57:40.918861 2053190 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key.febe6fed: {Name:mkecdbb5c8f6e97a0f1bd58e913a88ae3cdb0864 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 05:57:40.918953 2053190 certs.go:382] copying /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt.febe6fed -> /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt
	I1219 05:57:40.919033 2053190 certs.go:386] copying /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key.febe6fed -> /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key
	I1219 05:57:40.919083 2053190 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key
	I1219 05:57:40.919094 2053190 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt with IP's: []
	I1219 05:57:41.113880 2053190 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt ...
	I1219 05:57:41.113896 2053190 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt: {Name:mk13b69bc6e122d2d46dd86d11dbd2009ca4eb86 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 05:57:41.114092 2053190 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key ...
	I1219 05:57:41.114100 2053190 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key: {Name:mk5ff9b1f25460f8d9be7bf8af9aa07115ad764b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 05:57:41.114297 2053190 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem (1338 bytes)
	W1219 05:57:41.114336 2053190 certs.go:480] ignoring /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386_empty.pem, impossibly tiny 0 bytes
	I1219 05:57:41.114344 2053190 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem (1679 bytes)
	I1219 05:57:41.114370 2053190 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem (1078 bytes)
	I1219 05:57:41.114394 2053190 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem (1123 bytes)
	I1219 05:57:41.114418 2053190 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem (1671 bytes)
	I1219 05:57:41.114462 2053190 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 05:57:41.115066 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1219 05:57:41.135318 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1219 05:57:41.153721 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1219 05:57:41.171696 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1219 05:57:41.190087 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1219 05:57:41.207125 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1219 05:57:41.224908 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1219 05:57:41.242943 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1219 05:57:41.260503 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1219 05:57:41.278081 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem --> /usr/share/ca-certificates/2000386.pem (1338 bytes)
	I1219 05:57:41.295717 2053190 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /usr/share/ca-certificates/20003862.pem (1708 bytes)
	I1219 05:57:41.314164 2053190 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (722 bytes)
	I1219 05:57:41.327320 2053190 ssh_runner.go:195] Run: openssl version
	I1219 05:57:41.333634 2053190 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1219 05:57:41.340650 2053190 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1219 05:57:41.347815 2053190 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1219 05:57:41.351478 2053190 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 05:57:41.351533 2053190 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1219 05:57:41.392241 2053190 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1219 05:57:41.400235 2053190 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0
	I1219 05:57:41.408027 2053190 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/2000386.pem
	I1219 05:57:41.415285 2053190 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/2000386.pem /etc/ssl/certs/2000386.pem
	I1219 05:57:41.422931 2053190 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/2000386.pem
	I1219 05:57:41.426435 2053190 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 05:57:41.426488 2053190 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/2000386.pem
	I1219 05:57:41.467541 2053190 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1219 05:57:41.474744 2053190 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/2000386.pem /etc/ssl/certs/51391683.0
	I1219 05:57:41.481623 2053190 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/20003862.pem
	I1219 05:57:41.488382 2053190 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/20003862.pem /etc/ssl/certs/20003862.pem
	I1219 05:57:41.495466 2053190 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/20003862.pem
	I1219 05:57:41.499044 2053190 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 05:57:41.499099 2053190 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/20003862.pem
	I1219 05:57:41.540424 2053190 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1219 05:57:41.548215 2053190 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/20003862.pem /etc/ssl/certs/3ec20f2e.0
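The symlink names above (b5213941.0, 51391683.0, 3ec20f2e.0) are OpenSSL subject-hash lookups: OpenSSL finds a CA in /etc/ssl/certs via a `<subject-hash>.0` symlink, where the hash comes from `openssl x509 -hash`. A throwaway self-signed CA illustrates the naming; every path here is a temp file, and the CN is only an example:

```shell
# Reproduce the <hash>.0 symlink convention with a disposable CA.
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=minikubeCA" \
  -keyout "$dir/ca.key" -out "$dir/ca.pem" 2>/dev/null
hash=$(openssl x509 -hash -noout -in "$dir/ca.pem")   # e.g. b5213941
ln -fs "$dir/ca.pem" "$dir/$hash.0"                   # lookup name OpenSSL expects
echo "hash link: $hash.0"
```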
	I1219 05:57:41.555804 2053190 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 05:57:41.559340 2053190 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1219 05:57:41.559392 2053190 kubeadm.go:401] StartCluster: {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 05:57:41.559459 2053190 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1219 05:57:41.559521 2053190 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 05:57:41.585076 2053190 cri.go:92] found id: ""
	I1219 05:57:41.585144 2053190 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1219 05:57:41.592969 2053190 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1219 05:57:41.600876 2053190 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1219 05:57:41.600936 2053190 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 05:57:41.608775 2053190 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1219 05:57:41.608795 2053190 kubeadm.go:158] found existing configuration files:
	
	I1219 05:57:41.608849 2053190 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 05:57:41.616646 2053190 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1219 05:57:41.616706 2053190 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1219 05:57:41.624253 2053190 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 05:57:41.632115 2053190 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1219 05:57:41.632171 2053190 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 05:57:41.639758 2053190 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 05:57:41.647795 2053190 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1219 05:57:41.647850 2053190 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 05:57:41.657813 2053190 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 05:57:41.666888 2053190 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1219 05:57:41.666949 2053190 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 05:57:41.676120 2053190 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1219 05:57:41.721913 2053190 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1219 05:57:41.722219 2053190 kubeadm.go:319] [preflight] Running pre-flight checks
	I1219 05:57:41.797359 2053190 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1219 05:57:41.797422 2053190 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1219 05:57:41.797456 2053190 kubeadm.go:319] OS: Linux
	I1219 05:57:41.797508 2053190 kubeadm.go:319] CGROUPS_CPU: enabled
	I1219 05:57:41.797555 2053190 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1219 05:57:41.797601 2053190 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1219 05:57:41.797648 2053190 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1219 05:57:41.797695 2053190 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1219 05:57:41.797750 2053190 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1219 05:57:41.797794 2053190 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1219 05:57:41.797840 2053190 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1219 05:57:41.797885 2053190 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1219 05:57:41.867217 2053190 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1219 05:57:41.867320 2053190 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1219 05:57:41.867409 2053190 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1219 05:57:41.885157 2053190 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1219 05:57:41.891596 2053190 out.go:252]   - Generating certificates and keys ...
	I1219 05:57:41.891694 2053190 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1219 05:57:41.891760 2053190 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1219 05:57:42.202975 2053190 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1219 05:57:42.405335 2053190 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1219 05:57:43.067413 2053190 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1219 05:57:43.779919 2053190 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1219 05:57:43.883437 2053190 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1219 05:57:43.883566 2053190 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [functional-006924 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1219 05:57:44.038179 2053190 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1219 05:57:44.038497 2053190 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [functional-006924 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1219 05:57:44.397141 2053190 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1219 05:57:44.534160 2053190 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1219 05:57:45.074710 2053190 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1219 05:57:45.075297 2053190 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1219 05:57:45.297375 2053190 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1219 05:57:45.797119 2053190 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1219 05:57:46.132613 2053190 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1219 05:57:46.185225 2053190 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1219 05:57:46.402200 2053190 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1219 05:57:46.402823 2053190 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1219 05:57:46.406892 2053190 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1219 05:57:46.412710 2053190 out.go:252]   - Booting up control plane ...
	I1219 05:57:46.412835 2053190 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1219 05:57:46.412912 2053190 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1219 05:57:46.412978 2053190 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1219 05:57:46.429853 2053190 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1219 05:57:46.430111 2053190 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1219 05:57:46.437914 2053190 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1219 05:57:46.438191 2053190 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1219 05:57:46.438366 2053190 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1219 05:57:46.569269 2053190 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1219 05:57:46.569383 2053190 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1219 06:01:46.565693 2053190 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001039228s
	I1219 06:01:46.565720 2053190 kubeadm.go:319] 
	I1219 06:01:46.565823 2053190 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1219 06:01:46.565935 2053190 kubeadm.go:319] 	- The kubelet is not running
	I1219 06:01:46.566400 2053190 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1219 06:01:46.566416 2053190 kubeadm.go:319] 
	I1219 06:01:46.566597 2053190 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1219 06:01:46.566649 2053190 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1219 06:01:46.566700 2053190 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1219 06:01:46.566704 2053190 kubeadm.go:319] 
	I1219 06:01:46.571342 2053190 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 06:01:46.571755 2053190 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1219 06:01:46.571863 2053190 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 06:01:46.572096 2053190 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1219 06:01:46.572100 2053190 kubeadm.go:319] 
	I1219 06:01:46.572167 2053190 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1219 06:01:46.572292 2053190 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Generating "apiserver-kubelet-client" certificate and key
	[certs] Generating "front-proxy-ca" certificate and key
	[certs] Generating "front-proxy-client" certificate and key
	[certs] Generating "etcd/ca" certificate and key
	[certs] Generating "etcd/server" certificate and key
	[certs] etcd/server serving cert is signed for DNS names [functional-006924 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/peer" certificate and key
	[certs] etcd/peer serving cert is signed for DNS names [functional-006924 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/healthcheck-client" certificate and key
	[certs] Generating "apiserver-etcd-client" certificate and key
	[certs] Generating "sa" key and public key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001039228s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1219 06:01:46.572374 2053190 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1219 06:01:46.980554 2053190 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:01:46.994058 2053190 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1219 06:01:46.994115 2053190 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:01:47.002687 2053190 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1219 06:01:47.002710 2053190 kubeadm.go:158] found existing configuration files:
	
	I1219 06:01:47.002777 2053190 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 06:01:47.013794 2053190 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1219 06:01:47.013874 2053190 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1219 06:01:47.021858 2053190 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 06:01:47.029982 2053190 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1219 06:01:47.030045 2053190 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:01:47.037784 2053190 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 06:01:47.046069 2053190 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1219 06:01:47.046127 2053190 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:01:47.053782 2053190 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 06:01:47.061867 2053190 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1219 06:01:47.061929 2053190 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:01:47.069689 2053190 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1219 06:01:47.108150 2053190 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1219 06:01:47.108199 2053190 kubeadm.go:319] [preflight] Running pre-flight checks
	I1219 06:01:47.175527 2053190 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1219 06:01:47.175591 2053190 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1219 06:01:47.175625 2053190 kubeadm.go:319] OS: Linux
	I1219 06:01:47.175668 2053190 kubeadm.go:319] CGROUPS_CPU: enabled
	I1219 06:01:47.175715 2053190 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1219 06:01:47.175761 2053190 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1219 06:01:47.175808 2053190 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1219 06:01:47.175854 2053190 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1219 06:01:47.175901 2053190 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1219 06:01:47.175945 2053190 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1219 06:01:47.175999 2053190 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1219 06:01:47.176045 2053190 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1219 06:01:47.243286 2053190 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1219 06:01:47.243390 2053190 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1219 06:01:47.243506 2053190 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1219 06:01:47.248210 2053190 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1219 06:01:47.253554 2053190 out.go:252]   - Generating certificates and keys ...
	I1219 06:01:47.253650 2053190 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1219 06:01:47.253714 2053190 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1219 06:01:47.253813 2053190 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1219 06:01:47.253891 2053190 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1219 06:01:47.253970 2053190 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1219 06:01:47.254034 2053190 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1219 06:01:47.254104 2053190 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1219 06:01:47.254172 2053190 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1219 06:01:47.254266 2053190 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1219 06:01:47.254339 2053190 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1219 06:01:47.254378 2053190 kubeadm.go:319] [certs] Using the existing "sa" key
	I1219 06:01:47.254455 2053190 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1219 06:01:47.467528 2053190 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1219 06:01:47.582829 2053190 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1219 06:01:47.828521 2053190 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1219 06:01:47.925916 2053190 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1219 06:01:48.409670 2053190 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1219 06:01:48.410211 2053190 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1219 06:01:48.413014 2053190 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1219 06:01:48.416152 2053190 out.go:252]   - Booting up control plane ...
	I1219 06:01:48.416252 2053190 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1219 06:01:48.416334 2053190 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1219 06:01:48.416991 2053190 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1219 06:01:48.437177 2053190 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1219 06:01:48.437279 2053190 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1219 06:01:48.447290 2053190 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1219 06:01:48.447582 2053190 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1219 06:01:48.447769 2053190 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1219 06:01:48.581266 2053190 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1219 06:01:48.581388 2053190 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1219 06:05:48.581790 2053190 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000326287s
	I1219 06:05:48.581820 2053190 kubeadm.go:319] 
	I1219 06:05:48.581992 2053190 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1219 06:05:48.582064 2053190 kubeadm.go:319] 	- The kubelet is not running
	I1219 06:05:48.582472 2053190 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1219 06:05:48.582480 2053190 kubeadm.go:319] 
	I1219 06:05:48.582691 2053190 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1219 06:05:48.582757 2053190 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1219 06:05:48.582818 2053190 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1219 06:05:48.582824 2053190 kubeadm.go:319] 
	I1219 06:05:48.588085 2053190 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 06:05:48.588581 2053190 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1219 06:05:48.588698 2053190 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 06:05:48.588986 2053190 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1219 06:05:48.588999 2053190 kubeadm.go:319] 
	I1219 06:05:48.589080 2053190 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1219 06:05:48.589149 2053190 kubeadm.go:403] duration metric: took 8m7.029771032s to StartCluster
	I1219 06:05:48.589183 2053190 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:05:48.589250 2053190 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:05:48.613117 2053190 cri.go:92] found id: ""
	I1219 06:05:48.613141 2053190 logs.go:282] 0 containers: []
	W1219 06:05:48.613148 2053190 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:05:48.613155 2053190 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:05:48.613219 2053190 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:05:48.638004 2053190 cri.go:92] found id: ""
	I1219 06:05:48.638029 2053190 logs.go:282] 0 containers: []
	W1219 06:05:48.638037 2053190 logs.go:284] No container was found matching "etcd"
	I1219 06:05:48.638042 2053190 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:05:48.638114 2053190 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:05:48.675820 2053190 cri.go:92] found id: ""
	I1219 06:05:48.675834 2053190 logs.go:282] 0 containers: []
	W1219 06:05:48.675841 2053190 logs.go:284] No container was found matching "coredns"
	I1219 06:05:48.675846 2053190 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:05:48.675905 2053190 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:05:48.707053 2053190 cri.go:92] found id: ""
	I1219 06:05:48.707067 2053190 logs.go:282] 0 containers: []
	W1219 06:05:48.707075 2053190 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:05:48.707080 2053190 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:05:48.707139 2053190 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:05:48.736642 2053190 cri.go:92] found id: ""
	I1219 06:05:48.736656 2053190 logs.go:282] 0 containers: []
	W1219 06:05:48.736663 2053190 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:05:48.736668 2053190 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:05:48.736745 2053190 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:05:48.761685 2053190 cri.go:92] found id: ""
	I1219 06:05:48.761699 2053190 logs.go:282] 0 containers: []
	W1219 06:05:48.761706 2053190 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:05:48.761712 2053190 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:05:48.761773 2053190 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:05:48.786402 2053190 cri.go:92] found id: ""
	I1219 06:05:48.786417 2053190 logs.go:282] 0 containers: []
	W1219 06:05:48.786424 2053190 logs.go:284] No container was found matching "kindnet"
	I1219 06:05:48.786434 2053190 logs.go:123] Gathering logs for kubelet ...
	I1219 06:05:48.786444 2053190 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:05:48.841739 2053190 logs.go:123] Gathering logs for dmesg ...
	I1219 06:05:48.841758 2053190 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:05:48.859224 2053190 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:05:48.859240 2053190 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:05:48.928056 2053190 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:05:48.918418    4805 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:05:48.919168    4805 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:05:48.920962    4805 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:05:48.921619    4805 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:05:48.923169    4805 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:05:48.918418    4805 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:05:48.919168    4805 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:05:48.920962    4805 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:05:48.921619    4805 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:05:48.923169    4805 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:05:48.928069 2053190 logs.go:123] Gathering logs for containerd ...
	I1219 06:05:48.928081 2053190 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:05:48.967853 2053190 logs.go:123] Gathering logs for container status ...
	I1219 06:05:48.967874 2053190 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1219 06:05:48.999137 2053190 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000326287s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1219 06:05:48.999180 2053190 out.go:285] * 
	W1219 06:05:48.999422 2053190 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000326287s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1219 06:05:48.999478 2053190 out.go:285] * 
	W1219 06:05:49.003506 2053190 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
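The kubeadm failure above repeatedly points at the same two checks (`systemctl status kubelet`, `journalctl -xeu kubelet`) plus the kubelet healthz endpoint it polled. A minimal triage sketch of those steps, assuming it is run inside the minikube node (e.g. via `minikube ssh`) — the URL and unit name are taken from the log, not verified here:

```shell
#!/usr/bin/env sh
# Triage sketch for a kubelet that failed the kubeadm wait-control-plane phase.
# Assumes: run inside the node (e.g. `minikube ssh`); healthz URL from the log.
HEALTHZ_URL="http://127.0.0.1:10248/healthz"

if curl -sSf --max-time 5 "$HEALTHZ_URL" >/dev/null 2>&1; then
    echo "kubelet healthz: ok"
else
    echo "kubelet healthz: unreachable"
    # Inspect the systemd unit state and recent logs, as kubeadm advises.
    # `|| true` keeps the script going on non-systemd or minimal images.
    systemctl status kubelet --no-pager 2>/dev/null || true
    journalctl -xeu kubelet -n 50 --no-pager 2>/dev/null || true
fi
```

In this run the containers never started (every `crictl ps` query below returns no containers), so `journalctl -u kubelet` is the most likely place the root cause surfaces.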
	I1219 06:05:49.010063 2053190 out.go:203] 
	W1219 06:05:49.013043 2053190 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000326287s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1219 06:05:49.013158 2053190 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1219 06:05:49.013192 2053190 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1219 06:05:49.016281 2053190 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.712012339Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.712023449Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.712061340Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.712078046Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.712087901Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.712099593Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.712108693Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.712121928Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.712137747Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.712167393Z" level=info msg="Connect containerd service"
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.712451465Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.713105372Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.728425952Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.728490847Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.728522700Z" level=info msg="Start subscribing containerd event"
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.728565950Z" level=info msg="Start recovering state"
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.768484534Z" level=info msg="Start event monitor"
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.768549003Z" level=info msg="Start cni network conf syncer for default"
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.768559686Z" level=info msg="Start streaming server"
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.768568515Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.768578804Z" level=info msg="runtime interface starting up..."
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.768585614Z" level=info msg="starting plugins..."
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.768597660Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 19 05:57:39 functional-006924 containerd[766]: time="2025-12-19T05:57:39.768930480Z" level=info msg="containerd successfully booted in 0.082002s"
	Dec 19 05:57:39 functional-006924 systemd[1]: Started containerd.service - containerd container runtime.
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:05:49.992393    4919 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:05:49.993190    4919 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:05:49.994860    4919 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:05:49.995249    4919 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:05:49.997038    4919 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 06:05:50 up 10:48,  0 user,  load average: 0.13, 0.36, 0.97
	Linux functional-006924 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 19 06:05:46 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:05:47 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 318.
	Dec 19 06:05:47 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:05:47 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:05:47 functional-006924 kubelet[4726]: E1219 06:05:47.194162    4726 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:05:47 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:05:47 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:05:47 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 319.
	Dec 19 06:05:47 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:05:47 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:05:47 functional-006924 kubelet[4732]: E1219 06:05:47.947828    4732 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:05:47 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:05:47 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:05:48 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Dec 19 06:05:48 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:05:48 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:05:48 functional-006924 kubelet[4759]: E1219 06:05:48.711756    4759 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:05:48 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:05:48 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:05:49 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Dec 19 06:05:49 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:05:49 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:05:49 functional-006924 kubelet[4838]: E1219 06:05:49.455260    4838 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:05:49 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:05:49 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

                                                
                                                
-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924: exit status 6 (334.353783ms)

                                                
                                                
-- stdout --
	Stopped
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E1219 06:05:50.477470 2058977 status.go:458] kubeconfig endpoint: get endpoint: "functional-006924" does not appear in /home/jenkins/minikube-integration/22230-1998525/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:263: status error: exit status 6 (may be ok)
helpers_test.go:265: "functional-006924" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/StartWithProxy (502.35s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart (367.95s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart
I1219 06:05:50.492816 2000386 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
functional_test.go:674: (dbg) Run:  out/minikube-linux-arm64 start -p functional-006924 --alsologtostderr -v=8
E1219 06:06:29.405793 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:06:50.533942 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:06:57.091558 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:10:27.485968 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:11:29.406018 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:674: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-006924 --alsologtostderr -v=8: exit status 80 (6m5.187558736s)

                                                
                                                
-- stdout --
	* [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22230
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "functional-006924" primary control-plane node in "functional-006924" cluster
	* Pulling base image v0.0.48-1765966054-22186 ...
	* Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	* Verifying Kubernetes components...
	  - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	* Enabled addons: 
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I1219 06:05:50.537990 2059048 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:05:50.538849 2059048 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:05:50.538894 2059048 out.go:374] Setting ErrFile to fd 2...
	I1219 06:05:50.538913 2059048 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:05:50.539188 2059048 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:05:50.539610 2059048 out.go:368] Setting JSON to false
	I1219 06:05:50.540502 2059048 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":38897,"bootTime":1766085454,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:05:50.540601 2059048 start.go:143] virtualization:  
	I1219 06:05:50.544140 2059048 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:05:50.547152 2059048 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:05:50.547218 2059048 notify.go:221] Checking for updates...
	I1219 06:05:50.550931 2059048 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:05:50.553869 2059048 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:50.556730 2059048 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:05:50.559634 2059048 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:05:50.562450 2059048 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:05:50.565702 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:50.565828 2059048 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:05:50.590709 2059048 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:05:50.590846 2059048 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:05:50.653898 2059048 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:05:50.644590744 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:05:50.654020 2059048 docker.go:319] overlay module found
	I1219 06:05:50.657204 2059048 out.go:179] * Using the docker driver based on existing profile
	I1219 06:05:50.660197 2059048 start.go:309] selected driver: docker
	I1219 06:05:50.660214 2059048 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:50.660310 2059048 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:05:50.660408 2059048 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:05:50.713439 2059048 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:05:50.704333478 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:05:50.713872 2059048 cni.go:84] Creating CNI manager for ""
	I1219 06:05:50.713935 2059048 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:05:50.713992 2059048 start.go:353] cluster config:
	{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:50.717210 2059048 out.go:179] * Starting "functional-006924" primary control-plane node in "functional-006924" cluster
	I1219 06:05:50.719980 2059048 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1219 06:05:50.722935 2059048 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1219 06:05:50.726070 2059048 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:05:50.726124 2059048 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1219 06:05:50.726135 2059048 cache.go:65] Caching tarball of preloaded images
	I1219 06:05:50.726179 2059048 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1219 06:05:50.726225 2059048 preload.go:238] Found /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1219 06:05:50.726236 2059048 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1219 06:05:50.726339 2059048 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/config.json ...
	I1219 06:05:50.745888 2059048 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1219 06:05:50.745915 2059048 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1219 06:05:50.745932 2059048 cache.go:243] Successfully downloaded all kic artifacts
	I1219 06:05:50.745963 2059048 start.go:360] acquireMachinesLock for functional-006924: {Name:mkc84f48e83d18024791d45db780f3ccd746613a Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1219 06:05:50.746023 2059048 start.go:364] duration metric: took 37.752µs to acquireMachinesLock for "functional-006924"
	I1219 06:05:50.746049 2059048 start.go:96] Skipping create...Using existing machine configuration
	I1219 06:05:50.746059 2059048 fix.go:54] fixHost starting: 
	I1219 06:05:50.746334 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:50.762745 2059048 fix.go:112] recreateIfNeeded on functional-006924: state=Running err=<nil>
	W1219 06:05:50.762777 2059048 fix.go:138] unexpected machine state, will restart: <nil>
	I1219 06:05:50.765990 2059048 out.go:252] * Updating the running docker "functional-006924" container ...
	I1219 06:05:50.766020 2059048 machine.go:94] provisionDockerMachine start ...
	I1219 06:05:50.766101 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:50.782668 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:50.783000 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:50.783017 2059048 main.go:144] libmachine: About to run SSH command:
	hostname
	I1219 06:05:50.940618 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:05:50.940641 2059048 ubuntu.go:182] provisioning hostname "functional-006924"
	I1219 06:05:50.940708 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:50.964854 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:50.965181 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:50.965199 2059048 main.go:144] libmachine: About to run SSH command:
	sudo hostname functional-006924 && echo "functional-006924" | sudo tee /etc/hostname
	I1219 06:05:51.129720 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:05:51.129816 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.147357 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:51.147663 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:51.147686 2059048 main.go:144] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-006924' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-006924/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-006924' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1219 06:05:51.301923 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: 
	I1219 06:05:51.301949 2059048 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22230-1998525/.minikube CaCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22230-1998525/.minikube}
	I1219 06:05:51.301977 2059048 ubuntu.go:190] setting up certificates
	I1219 06:05:51.301985 2059048 provision.go:84] configureAuth start
	I1219 06:05:51.302047 2059048 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:05:51.323653 2059048 provision.go:143] copyHostCerts
	I1219 06:05:51.323700 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:05:51.323742 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem, removing ...
	I1219 06:05:51.323756 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:05:51.323832 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem (1078 bytes)
	I1219 06:05:51.323915 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:05:51.323932 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem, removing ...
	I1219 06:05:51.323937 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:05:51.323964 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem (1123 bytes)
	I1219 06:05:51.324003 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:05:51.324018 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem, removing ...
	I1219 06:05:51.324022 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:05:51.324044 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem (1671 bytes)
	I1219 06:05:51.324090 2059048 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem org=jenkins.functional-006924 san=[127.0.0.1 192.168.49.2 functional-006924 localhost minikube]
	I1219 06:05:51.441821 2059048 provision.go:177] copyRemoteCerts
	I1219 06:05:51.441886 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1219 06:05:51.441926 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.459787 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.570296 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1219 06:05:51.570372 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1219 06:05:51.588363 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1219 06:05:51.588477 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1219 06:05:51.605684 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1219 06:05:51.605798 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1219 06:05:51.623473 2059048 provision.go:87] duration metric: took 321.473451ms to configureAuth
	I1219 06:05:51.623556 2059048 ubuntu.go:206] setting minikube options for container-runtime
	I1219 06:05:51.623741 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:51.623756 2059048 machine.go:97] duration metric: took 857.728961ms to provisionDockerMachine
	I1219 06:05:51.623765 2059048 start.go:293] postStartSetup for "functional-006924" (driver="docker")
	I1219 06:05:51.623788 2059048 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1219 06:05:51.623849 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1219 06:05:51.623892 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.641371 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.760842 2059048 ssh_runner.go:195] Run: cat /etc/os-release
	I1219 06:05:51.764225 2059048 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1219 06:05:51.764245 2059048 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1219 06:05:51.764250 2059048 command_runner.go:130] > VERSION_ID="12"
	I1219 06:05:51.764255 2059048 command_runner.go:130] > VERSION="12 (bookworm)"
	I1219 06:05:51.764259 2059048 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1219 06:05:51.764263 2059048 command_runner.go:130] > ID=debian
	I1219 06:05:51.764268 2059048 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1219 06:05:51.764273 2059048 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1219 06:05:51.764281 2059048 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1219 06:05:51.764323 2059048 main.go:144] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1219 06:05:51.764339 2059048 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1219 06:05:51.764350 2059048 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/addons for local assets ...
	I1219 06:05:51.764404 2059048 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/files for local assets ...
	I1219 06:05:51.764485 2059048 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> 20003862.pem in /etc/ssl/certs
	I1219 06:05:51.764491 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> /etc/ssl/certs/20003862.pem
	I1219 06:05:51.764572 2059048 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> hosts in /etc/test/nested/copy/2000386
	I1219 06:05:51.764576 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> /etc/test/nested/copy/2000386/hosts
	I1219 06:05:51.764619 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/2000386
	I1219 06:05:51.772196 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:05:51.790438 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts --> /etc/test/nested/copy/2000386/hosts (40 bytes)
	I1219 06:05:51.808099 2059048 start.go:296] duration metric: took 184.303334ms for postStartSetup
	I1219 06:05:51.808203 2059048 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:05:51.808277 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.825566 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.929610 2059048 command_runner.go:130] > 14%
	I1219 06:05:51.930200 2059048 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1219 06:05:51.934641 2059048 command_runner.go:130] > 169G
	I1219 06:05:51.935117 2059048 fix.go:56] duration metric: took 1.189053781s for fixHost
	I1219 06:05:51.935139 2059048 start.go:83] releasing machines lock for "functional-006924", held for 1.189101272s
	I1219 06:05:51.935225 2059048 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:05:51.954055 2059048 ssh_runner.go:195] Run: cat /version.json
	I1219 06:05:51.954105 2059048 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1219 06:05:51.954110 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.954164 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.979421 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.998216 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:52.088735 2059048 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1219 06:05:52.088901 2059048 ssh_runner.go:195] Run: systemctl --version
	I1219 06:05:52.184102 2059048 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1219 06:05:52.186843 2059048 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1219 06:05:52.186921 2059048 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1219 06:05:52.187021 2059048 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1219 06:05:52.191424 2059048 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1219 06:05:52.191590 2059048 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1219 06:05:52.191669 2059048 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1219 06:05:52.199647 2059048 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1219 06:05:52.199671 2059048 start.go:496] detecting cgroup driver to use...
	I1219 06:05:52.199702 2059048 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1219 06:05:52.199771 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1219 06:05:52.215530 2059048 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1219 06:05:52.228927 2059048 docker.go:218] disabling cri-docker service (if available) ...
	I1219 06:05:52.229039 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1219 06:05:52.245166 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1219 06:05:52.258582 2059048 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1219 06:05:52.378045 2059048 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1219 06:05:52.513092 2059048 docker.go:234] disabling docker service ...
	I1219 06:05:52.513180 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1219 06:05:52.528704 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1219 06:05:52.542109 2059048 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1219 06:05:52.652456 2059048 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1219 06:05:52.767269 2059048 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1219 06:05:52.781039 2059048 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1219 06:05:52.797281 2059048 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1219 06:05:52.797396 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1219 06:05:52.807020 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1219 06:05:52.816571 2059048 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1219 06:05:52.816661 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1219 06:05:52.826225 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:05:52.835109 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1219 06:05:52.843741 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:05:52.852504 2059048 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1219 06:05:52.860160 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1219 06:05:52.868883 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1219 06:05:52.877906 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1219 06:05:52.887403 2059048 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1219 06:05:52.894024 2059048 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1219 06:05:52.894921 2059048 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1219 06:05:52.902164 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:53.021703 2059048 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1219 06:05:53.168216 2059048 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1219 06:05:53.168331 2059048 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1219 06:05:53.171951 2059048 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1219 06:05:53.172022 2059048 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1219 06:05:53.172043 2059048 command_runner.go:130] > Device: 0,72	Inode: 1614        Links: 1
	I1219 06:05:53.172065 2059048 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1219 06:05:53.172084 2059048 command_runner.go:130] > Access: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172112 2059048 command_runner.go:130] > Modify: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172131 2059048 command_runner.go:130] > Change: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172148 2059048 command_runner.go:130] >  Birth: -
	I1219 06:05:53.172331 2059048 start.go:564] Will wait 60s for crictl version
	I1219 06:05:53.172432 2059048 ssh_runner.go:195] Run: which crictl
	I1219 06:05:53.175887 2059048 command_runner.go:130] > /usr/local/bin/crictl
	I1219 06:05:53.176199 2059048 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1219 06:05:53.203136 2059048 command_runner.go:130] > Version:  0.1.0
	I1219 06:05:53.203389 2059048 command_runner.go:130] > RuntimeName:  containerd
	I1219 06:05:53.203588 2059048 command_runner.go:130] > RuntimeVersion:  v2.2.0
	I1219 06:05:53.203784 2059048 command_runner.go:130] > RuntimeApiVersion:  v1
	I1219 06:05:53.207710 2059048 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1219 06:05:53.207845 2059048 ssh_runner.go:195] Run: containerd --version
	I1219 06:05:53.235328 2059048 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1219 06:05:53.237219 2059048 ssh_runner.go:195] Run: containerd --version
	I1219 06:05:53.254490 2059048 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1219 06:05:53.262101 2059048 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1219 06:05:53.264978 2059048 cli_runner.go:164] Run: docker network inspect functional-006924 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 06:05:53.280549 2059048 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1219 06:05:53.284647 2059048 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1219 06:05:53.284847 2059048 kubeadm.go:884] updating cluster {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1219 06:05:53.284979 2059048 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:05:53.285048 2059048 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:05:53.307306 2059048 command_runner.go:130] > {
	I1219 06:05:53.307331 2059048 command_runner.go:130] >   "images":  [
	I1219 06:05:53.307335 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307345 2059048 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1219 06:05:53.307350 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307356 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1219 06:05:53.307360 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307365 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307373 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1219 06:05:53.307380 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307385 2059048 command_runner.go:130] >       "size":  "40636774",
	I1219 06:05:53.307391 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307395 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307402 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307405 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307417 2059048 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1219 06:05:53.307425 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307431 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1219 06:05:53.307435 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307441 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307450 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1219 06:05:53.307455 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307460 2059048 command_runner.go:130] >       "size":  "8034419",
	I1219 06:05:53.307463 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307467 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307470 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307482 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307492 2059048 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1219 06:05:53.307496 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307501 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1219 06:05:53.307505 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307519 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307528 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1219 06:05:53.307534 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307538 2059048 command_runner.go:130] >       "size":  "21168808",
	I1219 06:05:53.307542 2059048 command_runner.go:130] >       "username":  "nonroot",
	I1219 06:05:53.307546 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307549 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307552 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307559 2059048 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1219 06:05:53.307565 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307570 2059048 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1219 06:05:53.307581 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307585 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307592 2059048 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1219 06:05:53.307598 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307602 2059048 command_runner.go:130] >       "size":  "21749640",
	I1219 06:05:53.307610 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307614 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307618 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307622 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307625 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307631 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307634 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307641 2059048 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1219 06:05:53.307647 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307653 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1219 06:05:53.307666 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307670 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307682 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1219 06:05:53.307689 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307693 2059048 command_runner.go:130] >       "size":  "24692223",
	I1219 06:05:53.307697 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307708 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307712 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307716 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307723 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307726 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307729 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307736 2059048 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1219 06:05:53.307742 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307748 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1219 06:05:53.307753 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307757 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307765 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1219 06:05:53.307769 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307773 2059048 command_runner.go:130] >       "size":  "20672157",
	I1219 06:05:53.307779 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307783 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307788 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307792 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307796 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307799 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307802 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307809 2059048 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1219 06:05:53.307813 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307821 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1219 06:05:53.307826 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307830 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307840 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1219 06:05:53.307845 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307849 2059048 command_runner.go:130] >       "size":  "22432301",
	I1219 06:05:53.307858 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307864 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307867 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307870 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307877 2059048 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1219 06:05:53.307884 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307889 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1219 06:05:53.307893 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307899 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307907 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1219 06:05:53.307913 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307917 2059048 command_runner.go:130] >       "size":  "15405535",
	I1219 06:05:53.307921 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307925 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307928 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307932 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307939 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307942 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307948 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307955 2059048 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1219 06:05:53.307963 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307967 2059048 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1219 06:05:53.307970 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307974 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307982 2059048 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1219 06:05:53.307987 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307991 2059048 command_runner.go:130] >       "size":  "267939",
	I1219 06:05:53.307996 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.308000 2059048 command_runner.go:130] >         "value":  "65535"
	I1219 06:05:53.308004 2059048 command_runner.go:130] >       },
	I1219 06:05:53.308011 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.308015 2059048 command_runner.go:130] >       "pinned":  true
	I1219 06:05:53.308020 2059048 command_runner.go:130] >     }
	I1219 06:05:53.308027 2059048 command_runner.go:130] >   ]
	I1219 06:05:53.308030 2059048 command_runner.go:130] > }
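The JSON above is the output of `sudo crictl images --output json`, and the next log line ("all images are preloaded for containerd runtime") is minikube concluding that every required image tag appears in that list. A minimal sketch of that check in Python (hypothetical helper, not minikube's actual Go implementation in containerd.go; the payload below is a trimmed sample of the logged output):

```python
import json

# Trimmed sample of the `crictl images --output json` payload logged above.
crictl_output = json.dumps({
    "images": [
        {"id": "sha256:e08f4d9d...", "repoTags": ["registry.k8s.io/coredns/coredns:v1.13.1"],
         "size": "21168808", "pinned": False},
        {"id": "sha256:d7b100cd...", "repoTags": ["registry.k8s.io/pause:3.10.1"],
         "size": "267939", "pinned": True},
    ]
})

def all_preloaded(raw_json, required_tags):
    """Return True when every required tag appears in the crictl image list."""
    present = set()
    for img in json.loads(raw_json).get("images", []):
        present.update(img.get("repoTags", []))
    return all(tag in present for tag in required_tags)

print(all_preloaded(crictl_output, [
    "registry.k8s.io/coredns/coredns:v1.13.1",
    "registry.k8s.io/pause:3.10.1",
]))  # → True
```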
	I1219 06:05:53.310449 2059048 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:05:53.310472 2059048 containerd.go:534] Images already preloaded, skipping extraction
	I1219 06:05:53.310540 2059048 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:05:53.331271 2059048 command_runner.go:130] > {
	I1219 06:05:53.331288 2059048 command_runner.go:130] >   "images":  [
	I1219 06:05:53.331292 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331304 2059048 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1219 06:05:53.331309 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331314 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1219 06:05:53.331318 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331322 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331332 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1219 06:05:53.331336 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331340 2059048 command_runner.go:130] >       "size":  "40636774",
	I1219 06:05:53.331350 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331355 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331358 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331361 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331369 2059048 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1219 06:05:53.331373 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331378 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1219 06:05:53.331381 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331385 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331393 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1219 06:05:53.331396 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331400 2059048 command_runner.go:130] >       "size":  "8034419",
	I1219 06:05:53.331404 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331408 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331411 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331414 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331421 2059048 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1219 06:05:53.331425 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331430 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1219 06:05:53.331433 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331439 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331447 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1219 06:05:53.331451 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331454 2059048 command_runner.go:130] >       "size":  "21168808",
	I1219 06:05:53.331458 2059048 command_runner.go:130] >       "username":  "nonroot",
	I1219 06:05:53.331462 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331466 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331468 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331475 2059048 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1219 06:05:53.331479 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331484 2059048 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1219 06:05:53.331487 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331491 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331502 2059048 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1219 06:05:53.331506 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331510 2059048 command_runner.go:130] >       "size":  "21749640",
	I1219 06:05:53.331515 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331519 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331522 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331526 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331530 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331533 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331536 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331543 2059048 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1219 06:05:53.331547 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331551 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1219 06:05:53.331555 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331559 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331566 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1219 06:05:53.331569 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331573 2059048 command_runner.go:130] >       "size":  "24692223",
	I1219 06:05:53.331577 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331585 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331592 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331596 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331600 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331603 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331606 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331613 2059048 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1219 06:05:53.331617 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331622 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1219 06:05:53.331626 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331629 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331638 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1219 06:05:53.331641 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331645 2059048 command_runner.go:130] >       "size":  "20672157",
	I1219 06:05:53.331652 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331656 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331659 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331663 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331666 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331669 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331672 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331679 2059048 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1219 06:05:53.331683 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331688 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1219 06:05:53.331691 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331695 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331702 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1219 06:05:53.331705 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331709 2059048 command_runner.go:130] >       "size":  "22432301",
	I1219 06:05:53.331713 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331717 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331720 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331723 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331733 2059048 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1219 06:05:53.331737 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331742 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1219 06:05:53.331745 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331749 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331757 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1219 06:05:53.331760 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331764 2059048 command_runner.go:130] >       "size":  "15405535",
	I1219 06:05:53.331767 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331771 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331774 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331778 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331782 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331785 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331792 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331799 2059048 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1219 06:05:53.331803 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331807 2059048 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1219 06:05:53.331811 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331815 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331822 2059048 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1219 06:05:53.331826 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331829 2059048 command_runner.go:130] >       "size":  "267939",
	I1219 06:05:53.331833 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331837 2059048 command_runner.go:130] >         "value":  "65535"
	I1219 06:05:53.331841 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331845 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331849 2059048 command_runner.go:130] >       "pinned":  true
	I1219 06:05:53.331852 2059048 command_runner.go:130] >     }
	I1219 06:05:53.331855 2059048 command_runner.go:130] >   ]
	I1219 06:05:53.331858 2059048 command_runner.go:130] > }
	I1219 06:05:53.333541 2059048 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:05:53.333565 2059048 cache_images.go:86] Images are preloaded, skipping loading
	I1219 06:05:53.333574 2059048 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1219 06:05:53.333694 2059048 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-006924 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1219 06:05:53.333773 2059048 ssh_runner.go:195] Run: sudo crictl --timeout=10s info
	I1219 06:05:53.354890 2059048 command_runner.go:130] > {
	I1219 06:05:53.354909 2059048 command_runner.go:130] >   "cniconfig": {
	I1219 06:05:53.354915 2059048 command_runner.go:130] >     "Networks": [
	I1219 06:05:53.354919 2059048 command_runner.go:130] >       {
	I1219 06:05:53.354926 2059048 command_runner.go:130] >         "Config": {
	I1219 06:05:53.354932 2059048 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1219 06:05:53.354937 2059048 command_runner.go:130] >           "Name": "cni-loopback",
	I1219 06:05:53.354941 2059048 command_runner.go:130] >           "Plugins": [
	I1219 06:05:53.354945 2059048 command_runner.go:130] >             {
	I1219 06:05:53.354949 2059048 command_runner.go:130] >               "Network": {
	I1219 06:05:53.354953 2059048 command_runner.go:130] >                 "ipam": {},
	I1219 06:05:53.354958 2059048 command_runner.go:130] >                 "type": "loopback"
	I1219 06:05:53.354962 2059048 command_runner.go:130] >               },
	I1219 06:05:53.354967 2059048 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1219 06:05:53.354971 2059048 command_runner.go:130] >             }
	I1219 06:05:53.354975 2059048 command_runner.go:130] >           ],
	I1219 06:05:53.354988 2059048 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1219 06:05:53.354992 2059048 command_runner.go:130] >         },
	I1219 06:05:53.354997 2059048 command_runner.go:130] >         "IFName": "lo"
	I1219 06:05:53.355000 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355003 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355007 2059048 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1219 06:05:53.355011 2059048 command_runner.go:130] >     "PluginDirs": [
	I1219 06:05:53.355015 2059048 command_runner.go:130] >       "/opt/cni/bin"
	I1219 06:05:53.355027 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355031 2059048 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1219 06:05:53.355036 2059048 command_runner.go:130] >     "Prefix": "eth"
	I1219 06:05:53.355039 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355042 2059048 command_runner.go:130] >   "config": {
	I1219 06:05:53.355046 2059048 command_runner.go:130] >     "cdiSpecDirs": [
	I1219 06:05:53.355050 2059048 command_runner.go:130] >       "/etc/cdi",
	I1219 06:05:53.355059 2059048 command_runner.go:130] >       "/var/run/cdi"
	I1219 06:05:53.355062 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355066 2059048 command_runner.go:130] >     "cni": {
	I1219 06:05:53.355070 2059048 command_runner.go:130] >       "binDir": "",
	I1219 06:05:53.355073 2059048 command_runner.go:130] >       "binDirs": [
	I1219 06:05:53.355077 2059048 command_runner.go:130] >         "/opt/cni/bin"
	I1219 06:05:53.355080 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.355084 2059048 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1219 06:05:53.355088 2059048 command_runner.go:130] >       "confTemplate": "",
	I1219 06:05:53.355091 2059048 command_runner.go:130] >       "ipPref": "",
	I1219 06:05:53.355095 2059048 command_runner.go:130] >       "maxConfNum": 1,
	I1219 06:05:53.355099 2059048 command_runner.go:130] >       "setupSerially": false,
	I1219 06:05:53.355103 2059048 command_runner.go:130] >       "useInternalLoopback": false
	I1219 06:05:53.355106 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355114 2059048 command_runner.go:130] >     "containerd": {
	I1219 06:05:53.355119 2059048 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1219 06:05:53.355123 2059048 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1219 06:05:53.355128 2059048 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1219 06:05:53.355132 2059048 command_runner.go:130] >       "runtimes": {
	I1219 06:05:53.355136 2059048 command_runner.go:130] >         "runc": {
	I1219 06:05:53.355140 2059048 command_runner.go:130] >           "ContainerAnnotations": null,
	I1219 06:05:53.355145 2059048 command_runner.go:130] >           "PodAnnotations": null,
	I1219 06:05:53.355151 2059048 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1219 06:05:53.355155 2059048 command_runner.go:130] >           "cgroupWritable": false,
	I1219 06:05:53.355159 2059048 command_runner.go:130] >           "cniConfDir": "",
	I1219 06:05:53.355163 2059048 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1219 06:05:53.355167 2059048 command_runner.go:130] >           "io_type": "",
	I1219 06:05:53.355171 2059048 command_runner.go:130] >           "options": {
	I1219 06:05:53.355174 2059048 command_runner.go:130] >             "BinaryName": "",
	I1219 06:05:53.355179 2059048 command_runner.go:130] >             "CriuImagePath": "",
	I1219 06:05:53.355183 2059048 command_runner.go:130] >             "CriuWorkPath": "",
	I1219 06:05:53.355187 2059048 command_runner.go:130] >             "IoGid": 0,
	I1219 06:05:53.355190 2059048 command_runner.go:130] >             "IoUid": 0,
	I1219 06:05:53.355198 2059048 command_runner.go:130] >             "NoNewKeyring": false,
	I1219 06:05:53.355201 2059048 command_runner.go:130] >             "Root": "",
	I1219 06:05:53.355205 2059048 command_runner.go:130] >             "ShimCgroup": "",
	I1219 06:05:53.355210 2059048 command_runner.go:130] >             "SystemdCgroup": false
	I1219 06:05:53.355214 2059048 command_runner.go:130] >           },
	I1219 06:05:53.355219 2059048 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1219 06:05:53.355225 2059048 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1219 06:05:53.355229 2059048 command_runner.go:130] >           "runtimePath": "",
	I1219 06:05:53.355233 2059048 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1219 06:05:53.355238 2059048 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1219 06:05:53.355242 2059048 command_runner.go:130] >           "snapshotter": ""
	I1219 06:05:53.355245 2059048 command_runner.go:130] >         }
	I1219 06:05:53.355248 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355252 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355262 2059048 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1219 06:05:53.355267 2059048 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1219 06:05:53.355273 2059048 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1219 06:05:53.355277 2059048 command_runner.go:130] >     "disableApparmor": false,
	I1219 06:05:53.355282 2059048 command_runner.go:130] >     "disableHugetlbController": true,
	I1219 06:05:53.355286 2059048 command_runner.go:130] >     "disableProcMount": false,
	I1219 06:05:53.355290 2059048 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1219 06:05:53.355294 2059048 command_runner.go:130] >     "enableCDI": true,
	I1219 06:05:53.355298 2059048 command_runner.go:130] >     "enableSelinux": false,
	I1219 06:05:53.355302 2059048 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1219 06:05:53.355306 2059048 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1219 06:05:53.355311 2059048 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1219 06:05:53.355319 2059048 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1219 06:05:53.355323 2059048 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1219 06:05:53.355328 2059048 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1219 06:05:53.355332 2059048 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1219 06:05:53.355338 2059048 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1219 06:05:53.355342 2059048 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1219 06:05:53.355347 2059048 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1219 06:05:53.355357 2059048 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1219 06:05:53.355362 2059048 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1219 06:05:53.355365 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355369 2059048 command_runner.go:130] >   "features": {
	I1219 06:05:53.355373 2059048 command_runner.go:130] >     "supplemental_groups_policy": true
	I1219 06:05:53.355376 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355379 2059048 command_runner.go:130] >   "golang": "go1.24.9",
	I1219 06:05:53.355389 2059048 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1219 06:05:53.355399 2059048 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1219 06:05:53.355402 2059048 command_runner.go:130] >   "runtimeHandlers": [
	I1219 06:05:53.355406 2059048 command_runner.go:130] >     {
	I1219 06:05:53.355409 2059048 command_runner.go:130] >       "features": {
	I1219 06:05:53.355414 2059048 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1219 06:05:53.355418 2059048 command_runner.go:130] >         "user_namespaces": true
	I1219 06:05:53.355421 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355424 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355427 2059048 command_runner.go:130] >     {
	I1219 06:05:53.355431 2059048 command_runner.go:130] >       "features": {
	I1219 06:05:53.355436 2059048 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1219 06:05:53.355440 2059048 command_runner.go:130] >         "user_namespaces": true
	I1219 06:05:53.355443 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355447 2059048 command_runner.go:130] >       "name": "runc"
	I1219 06:05:53.355449 2059048 command_runner.go:130] >     }
	I1219 06:05:53.355452 2059048 command_runner.go:130] >   ],
	I1219 06:05:53.355456 2059048 command_runner.go:130] >   "status": {
	I1219 06:05:53.355460 2059048 command_runner.go:130] >     "conditions": [
	I1219 06:05:53.355463 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355467 2059048 command_runner.go:130] >         "message": "",
	I1219 06:05:53.355471 2059048 command_runner.go:130] >         "reason": "",
	I1219 06:05:53.355475 2059048 command_runner.go:130] >         "status": true,
	I1219 06:05:53.355480 2059048 command_runner.go:130] >         "type": "RuntimeReady"
	I1219 06:05:53.355483 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355486 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355495 2059048 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1219 06:05:53.355500 2059048 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1219 06:05:53.355504 2059048 command_runner.go:130] >         "status": false,
	I1219 06:05:53.355508 2059048 command_runner.go:130] >         "type": "NetworkReady"
	I1219 06:05:53.355512 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355515 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355536 2059048 command_runner.go:130] >         "message": "{\"io.containerd.deprecation/cgroup-v1\":\"The support for cgroup v1 is deprecated since containerd v2.2 and will be removed by no later than May 2029. Upgrade the host to use cgroup v2.\"}",
	I1219 06:05:53.355541 2059048 command_runner.go:130] >         "reason": "ContainerdHasDeprecationWarnings",
	I1219 06:05:53.355547 2059048 command_runner.go:130] >         "status": false,
	I1219 06:05:53.355552 2059048 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1219 06:05:53.355555 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355557 2059048 command_runner.go:130] >     ]
	I1219 06:05:53.355560 2059048 command_runner.go:130] >   }
	I1219 06:05:53.355563 2059048 command_runner.go:130] > }
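The `crictl info` status block above reports `RuntimeReady: true` but `NetworkReady: false` ("cni plugin not initialized"), which is why the very next log lines create a CNI manager and recommend kindnet. A minimal sketch of reading those conditions in Python (hypothetical helper; the dict below is a trimmed sample of the logged status block):

```python
# Trimmed sample of the `sudo crictl --timeout=10s info` status block logged above.
info = {
    "status": {
        "conditions": [
            {"type": "RuntimeReady", "status": True, "reason": "", "message": ""},
            {"type": "NetworkReady", "status": False,
             "reason": "NetworkPluginNotReady",
             "message": "Network plugin returns error: cni plugin not initialized"},
        ]
    }
}

def failing_conditions(info):
    """Return (type, reason) for every runtime condition whose status is false."""
    return [(c["type"], c["reason"])
            for c in info.get("status", {}).get("conditions", [])
            if not c["status"]]

print(failing_conditions(info))  # → [('NetworkReady', 'NetworkPluginNotReady')]
```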
	I1219 06:05:53.357747 2059048 cni.go:84] Creating CNI manager for ""
	I1219 06:05:53.357770 2059048 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:05:53.357795 2059048 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1219 06:05:53.357824 2059048 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-006924 NodeName:functional-006924 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1219 06:05:53.357938 2059048 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-006924"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
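The kubeadm config rendered above is a multi-document YAML stream: four documents (InitConfiguration, ClusterConfiguration, KubeletConfiguration, KubeProxyConfiguration) separated by `---` markers. A minimal stdlib-only sketch (not minikube code) of splitting such a stream and reading each document's top-level `kind`:

```python
# Split a kubeadm multi-document YAML stream and report each document's kind,
# using only the standard library (no PyYAML dependency). Illustrative sketch;
# the SAMPLE below is a trimmed stand-in for the config emitted in the log.

SAMPLE = """\
apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
"""

def document_kinds(stream: str) -> list[str]:
    """Return the top-level `kind` of each document in a YAML stream."""
    kinds = []
    for doc in stream.split("\n---\n"):
        for line in doc.splitlines():
            if line.startswith("kind:"):
                kinds.append(line.split(":", 1)[1].strip())
                break
    return kinds

if __name__ == "__main__":
    print(document_kinds(SAMPLE))
```

The real parser in kubeadm does full YAML decoding per document; the split-on-`---` approach is only a sketch of the stream structure.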
	I1219 06:05:53.358021 2059048 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1219 06:05:53.365051 2059048 command_runner.go:130] > kubeadm
	I1219 06:05:53.365070 2059048 command_runner.go:130] > kubectl
	I1219 06:05:53.365074 2059048 command_runner.go:130] > kubelet
	I1219 06:05:53.366033 2059048 binaries.go:51] Found k8s binaries, skipping transfer
	I1219 06:05:53.366118 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1219 06:05:53.373810 2059048 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1219 06:05:53.386231 2059048 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1219 06:05:53.399156 2059048 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2235 bytes)
	I1219 06:05:53.411832 2059048 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1219 06:05:53.415476 2059048 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1219 06:05:53.415580 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:53.524736 2059048 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:05:53.900522 2059048 certs.go:69] Setting up /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924 for IP: 192.168.49.2
	I1219 06:05:53.900547 2059048 certs.go:195] generating shared ca certs ...
	I1219 06:05:53.900563 2059048 certs.go:227] acquiring lock for ca certs: {Name:mk382c71693ea4061363f97b153b21bf6cdf5f38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:53.900702 2059048 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key
	I1219 06:05:53.900780 2059048 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key
	I1219 06:05:53.900803 2059048 certs.go:257] generating profile certs ...
	I1219 06:05:53.900908 2059048 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key
	I1219 06:05:53.900976 2059048 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key.febe6fed
	I1219 06:05:53.901024 2059048 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key
	I1219 06:05:53.901037 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1219 06:05:53.901081 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1219 06:05:53.901098 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1219 06:05:53.901109 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1219 06:05:53.901127 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1219 06:05:53.901139 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1219 06:05:53.901154 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1219 06:05:53.901171 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1219 06:05:53.901229 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem (1338 bytes)
	W1219 06:05:53.901264 2059048 certs.go:480] ignoring /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386_empty.pem, impossibly tiny 0 bytes
	I1219 06:05:53.901277 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem (1679 bytes)
	I1219 06:05:53.901306 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem (1078 bytes)
	I1219 06:05:53.901333 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem (1123 bytes)
	I1219 06:05:53.901365 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem (1671 bytes)
	I1219 06:05:53.901418 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:05:53.901449 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:53.901465 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem -> /usr/share/ca-certificates/2000386.pem
	I1219 06:05:53.901481 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> /usr/share/ca-certificates/20003862.pem
	I1219 06:05:53.902039 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1219 06:05:53.926748 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1219 06:05:53.945718 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1219 06:05:53.964111 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1219 06:05:53.984388 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1219 06:05:54.005796 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1219 06:05:54.027058 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1219 06:05:54.045330 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1219 06:05:54.062681 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1219 06:05:54.080390 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem --> /usr/share/ca-certificates/2000386.pem (1338 bytes)
	I1219 06:05:54.102399 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /usr/share/ca-certificates/20003862.pem (1708 bytes)
	I1219 06:05:54.120580 2059048 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (722 bytes)
	I1219 06:05:54.133732 2059048 ssh_runner.go:195] Run: openssl version
	I1219 06:05:54.139799 2059048 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1219 06:05:54.140191 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.147812 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/20003862.pem /etc/ssl/certs/20003862.pem
	I1219 06:05:54.155315 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159037 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159108 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159165 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.200029 2059048 command_runner.go:130] > 3ec20f2e
	I1219 06:05:54.200546 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1219 06:05:54.208733 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.216254 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1219 06:05:54.224240 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228059 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228165 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228244 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.268794 2059048 command_runner.go:130] > b5213941
	I1219 06:05:54.269372 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1219 06:05:54.277054 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.284467 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/2000386.pem /etc/ssl/certs/2000386.pem
	I1219 06:05:54.291949 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295750 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295798 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295849 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.341163 2059048 command_runner.go:130] > 51391683
	I1219 06:05:54.341782 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
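The three check/link/verify sequences above follow OpenSSL's hashed-directory (c_rehash) convention: each CA certificate is looked up under `/etc/ssl/certs/` by a symlink named `<subject-hash>.<n>`, where the hash comes from `openssl x509 -hash` (e.g. `b5213941` for minikubeCA here) and `n` disambiguates hash collisions, starting at 0. A trivial sketch of that naming rule (the helper name is mine, not OpenSSL's):

```python
def hash_link_name(subject_hash: str, collision_index: int = 0) -> str:
    """OpenSSL hashed-directory convention: a trusted CA cert is found via a
    symlink named <subject-hash>.<n>; n counts certs sharing the same hash."""
    return f"{subject_hash}.{collision_index}"

print(hash_link_name("b5213941"))     # b5213941.0
print(hash_link_name("3ec20f2e", 1))  # 3ec20f2e.1
```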
	I1219 06:05:54.349497 2059048 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:05:54.353229 2059048 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:05:54.353253 2059048 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1219 06:05:54.353261 2059048 command_runner.go:130] > Device: 259,1	Inode: 1582667     Links: 1
	I1219 06:05:54.353268 2059048 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1219 06:05:54.353275 2059048 command_runner.go:130] > Access: 2025-12-19 06:01:47.245300782 +0000
	I1219 06:05:54.353281 2059048 command_runner.go:130] > Modify: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353286 2059048 command_runner.go:130] > Change: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353294 2059048 command_runner.go:130] >  Birth: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353372 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1219 06:05:54.398897 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.399374 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1219 06:05:54.440111 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.440565 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1219 06:05:54.481409 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.481968 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1219 06:05:54.522576 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.523020 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1219 06:05:54.563365 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.563892 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1219 06:05:54.604428 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.604920 2059048 kubeadm.go:401] StartCluster: {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:54.605002 2059048 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1219 06:05:54.605063 2059048 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:05:54.631433 2059048 cri.go:92] found id: ""
	I1219 06:05:54.631512 2059048 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1219 06:05:54.638289 2059048 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1219 06:05:54.638353 2059048 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1219 06:05:54.638374 2059048 command_runner.go:130] > /var/lib/minikube/etcd:
	I1219 06:05:54.639191 2059048 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1219 06:05:54.639207 2059048 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1219 06:05:54.639278 2059048 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1219 06:05:54.646289 2059048 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:05:54.646704 2059048 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-006924" does not appear in /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.646809 2059048 kubeconfig.go:62] /home/jenkins/minikube-integration/22230-1998525/kubeconfig needs updating (will repair): [kubeconfig missing "functional-006924" cluster setting kubeconfig missing "functional-006924" context setting]
	I1219 06:05:54.647118 2059048 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/kubeconfig: {Name:mk7db1732c7d76f01100426cb283dc7515a3b9ea Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.647542 2059048 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.647700 2059048 kapi.go:59] client config for functional-006924: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1219 06:05:54.648289 2059048 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1219 06:05:54.648312 2059048 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1219 06:05:54.648318 2059048 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1219 06:05:54.648377 2059048 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1219 06:05:54.648389 2059048 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1219 06:05:54.648357 2059048 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1219 06:05:54.648779 2059048 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1219 06:05:54.659696 2059048 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1219 06:05:54.659739 2059048 kubeadm.go:602] duration metric: took 20.517186ms to restartPrimaryControlPlane
	I1219 06:05:54.659750 2059048 kubeadm.go:403] duration metric: took 54.838405ms to StartCluster
	I1219 06:05:54.659766 2059048 settings.go:142] acquiring lock: {Name:mk0fb518a1861caea9ce90c087e9f98ff93c6842 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.659859 2059048 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.660602 2059048 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/kubeconfig: {Name:mk7db1732c7d76f01100426cb283dc7515a3b9ea Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.660878 2059048 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1219 06:05:54.661080 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:54.661197 2059048 addons.go:543] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1219 06:05:54.661465 2059048 addons.go:70] Setting storage-provisioner=true in profile "functional-006924"
	I1219 06:05:54.661481 2059048 addons.go:239] Setting addon storage-provisioner=true in "functional-006924"
	I1219 06:05:54.661506 2059048 host.go:66] Checking if "functional-006924" exists ...
	I1219 06:05:54.661954 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.662128 2059048 addons.go:70] Setting default-storageclass=true in profile "functional-006924"
	I1219 06:05:54.662158 2059048 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-006924"
	I1219 06:05:54.662427 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.667300 2059048 out.go:179] * Verifying Kubernetes components...
	I1219 06:05:54.673650 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:54.689683 2059048 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.689848 2059048 kapi.go:59] client config for functional-006924: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1219 06:05:54.690123 2059048 addons.go:239] Setting addon default-storageclass=true in "functional-006924"
	I1219 06:05:54.690152 2059048 host.go:66] Checking if "functional-006924" exists ...
	I1219 06:05:54.690560 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.715008 2059048 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1219 06:05:54.717850 2059048 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:54.717879 2059048 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1219 06:05:54.717946 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:54.734767 2059048 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:54.734788 2059048 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1219 06:05:54.734856 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:54.764236 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:54.773070 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:54.876977 2059048 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:05:54.898675 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:54.923995 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:55.652544 2059048 node_ready.go:35] waiting up to 6m0s for node "functional-006924" to be "Ready" ...
	I1219 06:05:55.652680 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:55.652777 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:55.653088 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.653133 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653174 2059048 retry.go:31] will retry after 152.748ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653242 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.653274 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653290 2059048 retry.go:31] will retry after 222.401366ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:55.806850 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:55.871164 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.871241 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.871268 2059048 retry.go:31] will retry after 248.166368ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.876351 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:55.932419 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.936105 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.936137 2059048 retry.go:31] will retry after 191.546131ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.120512 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:56.128049 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:56.153420 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:56.153544 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:56.153844 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:56.188805 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.192400 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.192475 2059048 retry.go:31] will retry after 421.141509ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.203130 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.203228 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.203252 2059048 retry.go:31] will retry after 495.708783ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.614800 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:56.653236 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:56.653361 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:56.653708 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:56.677894 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.677943 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.677993 2059048 retry.go:31] will retry after 980.857907ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.700099 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:56.755124 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.758623 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.758652 2059048 retry.go:31] will retry after 1.143622688s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.152911 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:57.153042 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:57.153399 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:57.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:57.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:57.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:05:57.653378 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:05:57.659518 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:57.724667 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:57.724716 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.724735 2059048 retry.go:31] will retry after 900.329628ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.903067 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:57.986230 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:57.986314 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.986340 2059048 retry.go:31] will retry after 1.7845791s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:58.153671 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:58.153749 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:58.154120 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:58.625732 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:58.653113 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:58.653187 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:58.653475 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:58.712944 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:58.713042 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:58.713071 2059048 retry.go:31] will retry after 2.322946675s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:59.153740 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:59.153822 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:59.154186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:59.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:59.652927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:59.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:59.771577 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:59.835749 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:59.839442 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:59.839476 2059048 retry.go:31] will retry after 2.412907222s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:00.152821 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:00.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:00.153306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:00.153393 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:00.653320 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:00.653404 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:00.653734 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:01.036322 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:01.102362 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:01.106179 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:01.106214 2059048 retry.go:31] will retry after 2.139899672s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:01.153490 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:01.153572 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:01.153855 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:01.653656 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:01.653732 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:01.654026 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:02.152793 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:02.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:02.153204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:02.252582 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:02.312437 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:02.312479 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:02.312500 2059048 retry.go:31] will retry after 1.566668648s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:02.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:02.652958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:02.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:02.653283 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:03.152957 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:03.153054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:03.153393 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:03.246844 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:03.302237 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:03.305728 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.305771 2059048 retry.go:31] will retry after 6.170177016s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.653408 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:03.653482 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:03.653834 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:03.880237 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:03.939688 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:03.939736 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.939756 2059048 retry.go:31] will retry after 4.919693289s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:04.153025 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:04.153101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:04.153368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:04.653333 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:04.653405 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:04.653716 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:04.653762 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:05.153589 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:05.153680 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:05.154012 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:05.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:05.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:05.653271 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:06.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:06.152922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:06.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:06.652913 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:06.652987 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:06.653350 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:07.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:07.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:07.153248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:07.153305 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:07.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:07.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:07.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:08.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:08.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.652856 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:08.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:08.653179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.859603 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:08.923746 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:08.923802 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:08.923824 2059048 retry.go:31] will retry after 7.49455239s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.153273 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:09.153361 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:09.153733 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:09.153794 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:09.476166 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:09.536340 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:09.536378 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.536397 2059048 retry.go:31] will retry after 3.264542795s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.652787 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:09.652863 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:09.653191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:10.152879 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:10.152955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:10.153217 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:10.653092 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:10.653172 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:10.653505 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:11.153189 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:11.153267 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:11.153564 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:11.653360 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:11.653432 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:11.653748 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:11.653809 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:12.153584 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:12.153667 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:12.154066 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:12.652816 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:12.652897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:12.653232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:12.801732 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:12.858668 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:12.858722 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:12.858742 2059048 retry.go:31] will retry after 7.015856992s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:13.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:13.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:13.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:13.652838 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:13.652915 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:13.653206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:14.152876 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:14.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:14.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:14.153340 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:14.653224 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:14.653299 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:14.653566 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:15.153381 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:15.153458 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:15.153856 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:15.653715 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:15.653796 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:15.654137 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:16.153469 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:16.153543 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:16.153826 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:16.153868 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:16.419404 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:16.476671 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:16.480081 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:16.480119 2059048 retry.go:31] will retry after 7.9937579s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:16.653575 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:16.653716 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:16.653985 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:17.153751 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:17.153850 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:17.154134 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:17.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:17.652939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:17.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:18.152893 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:18.152976 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:18.153301 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:18.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:18.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:18.653233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:18.653289 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:19.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:19.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:19.153227 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:19.652934 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:19.653010 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:19.653354 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:19.875781 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:19.950537 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:19.954067 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:19.954097 2059048 retry.go:31] will retry after 12.496952157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:20.153579 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:20.153656 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:20.154027 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:20.652751 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:20.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:20.653178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:21.153037 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:21.153112 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:21.153446 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:21.153504 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:21.652852 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:21.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:21.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:22.152882 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:22.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:22.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:22.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:22.652925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:22.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:23.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:23.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:23.153240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:23.652818 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:23.652892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:23.653158 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:23.653200 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:24.152783 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:24.152857 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:24.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:24.474774 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:24.538538 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:24.538585 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:24.538605 2059048 retry.go:31] will retry after 14.635173495s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:24.653139 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:24.653215 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:24.653538 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:25.153284 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:25.153354 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:25.153661 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:25.653607 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:25.653689 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:25.653986 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:25.654040 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
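The `round_trippers` entries above use klog's structured `key="value"` format (`verb="GET" url="..."`). When post-processing a log like this one, the pairs can be pulled out with a small parser; this is a minimal sketch for analysis purposes, not minikube's own code:

```go
package main

import (
	"fmt"
	"regexp"
)

// kvRe matches key="value" pairs as they appear in klog structured
// output, e.g. verb="GET" url="https://...".
var kvRe = regexp.MustCompile(`(\w+)="([^"]*)"`)

// parseKV returns the key="value" pairs found in one log line.
func parseKV(line string) map[string]string {
	out := map[string]string{}
	for _, m := range kvRe.FindAllStringSubmatch(line, -1) {
		out[m[1]] = m[2]
	}
	return out
}

func main() {
	line := `round_trippers.go:527] "Request" verb="GET" ` +
		`url="https://192.168.49.2:8441/api/v1/nodes/functional-006924"`
	kv := parseKV(line)
	fmt.Println(kv["verb"], kv["url"])
	// GET https://192.168.49.2:8441/api/v1/nodes/functional-006924
}
```

Counting `status=""` responses this way is a quick check on how long the apiserver stayed unreachable during the restart.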
	I1219 06:06:26.152728 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:26.152857 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:26.153186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:26.652777 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:26.652852 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:26.653175 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:27.152839 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:27.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:27.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:27.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:27.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:27.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:28.152874 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:28.152956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:28.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:28.153286 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:28.652849 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:28.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:28.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:29.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:29.152930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:29.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:29.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:29.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:29.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:30.152853 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:30.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:30.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:30.153348 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:30.653022 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:30.653115 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:30.653416 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:31.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:31.152960 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:31.153275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:31.652985 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:31.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:31.653405 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:32.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:32.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:32.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:32.451758 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:32.506473 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:32.509966 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:32.509998 2059048 retry.go:31] will retry after 31.028140902s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:32.653234 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:32.653309 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:32.653632 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:32.653749 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:33.153497 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:33.153583 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:33.153949 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:33.652732 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:33.652832 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:33.653182 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:34.152891 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:34.152964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:34.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:34.653098 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:34.653173 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:34.653525 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:35.153363 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:35.153489 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:35.153845 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:35.153907 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:35.653568 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:35.653649 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:35.653928 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:36.153725 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:36.153800 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:36.154115 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:36.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:36.652933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:36.653274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:37.153419 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:37.153492 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:37.153866 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:37.153952 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:37.653726 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:37.653797 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:37.654143 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:38.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:38.152933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:38.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:38.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:38.652935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:38.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:39.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:39.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:39.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:39.174643 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:39.239291 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:39.239335 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:39.239354 2059048 retry.go:31] will retry after 15.420333699s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:39.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:39.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:39.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:39.653316 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:40.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:40.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:40.153285 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:40.653056 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:40.653131 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:40.653494 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:41.153188 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:41.153263 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:41.153588 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:41.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:41.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:41.653248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:42.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:42.152964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:42.153379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:42.153461 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:42.652919 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:42.653000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:42.653314 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:43.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:43.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:43.153240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:43.652956 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:43.653027 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:43.653379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:44.152963 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:44.153044 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:44.153381 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:44.653201 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:44.653284 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:44.653550 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:44.653592 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:45.153794 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:45.153882 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:45.154325 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:45.653107 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:45.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:45.653497 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:46.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:46.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:46.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:46.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:46.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:46.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:47.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:47.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:47.153246 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:47.153293 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:47.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:47.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:47.653331 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:48.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:48.152904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:48.153254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:48.652976 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:48.653054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:48.653401 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:49.152928 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:49.153003 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:49.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:49.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:49.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:49.653266 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:49.653325 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:50.152835 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:50.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:50.153230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:50.653021 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:50.653097 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:50.653360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:51.152819 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:51.152892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:51.153216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:51.652938 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:51.653021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:51.653350 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:51.653404 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:52.152878 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:52.152997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:52.153340 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:52.653054 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:52.653126 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:52.653428 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:53.152809 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:53.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:53.153202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:53.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:53.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:53.653212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:54.152921 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:54.153000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:54.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:54.153361 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:54.653430 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:54.653504 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:54.653886 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:54.660097 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:54.724740 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:54.724806 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:54.724824 2059048 retry.go:31] will retry after 21.489743806s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:55.153047 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:55.153170 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:55.153542 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:55.653137 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:55.653210 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:55.653500 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:56.153216 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:56.153285 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:56.153620 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:56.153682 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:56.653420 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:56.653501 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:56.653832 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:57.153605 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:57.153702 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:57.154020 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:57.652746 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:57.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:57.653184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:58.152882 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:58.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:58.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:58.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:58.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:58.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:58.653262 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:59.152798 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:59.152874 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:59.153218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:59.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:59.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:59.653193 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:00.155125 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:00.155210 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:00.156183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:00.653343 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:00.653416 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:00.653737 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:00.653787 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:01.152970 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:01.153062 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:01.153389 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:01.652835 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:01.652908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:01.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:02.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:02.152952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:02.153330 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:02.652900 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:02.652986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:02.653368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:03.152826 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:03.152908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:03.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:03.153306 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:03.538820 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:07:03.598261 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:03.602187 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:03.602221 2059048 retry.go:31] will retry after 27.693032791s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:03.653410 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:03.653486 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:03.653840 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:04.153298 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:04.153371 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:04.153670 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:04.653539 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:04.653618 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:04.653956 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:05.153749 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:05.153837 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:05.154160 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:05.154219 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:05.653149 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:05.653217 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:05.653546 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:06.153378 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:06.153468 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:06.153799 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:06.653420 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:06.653494 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:06.653803 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:07.153113 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:07.153187 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:07.153451 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:07.652897 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:07.652979 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:07.653292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:07.653351 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:08.152825 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:08.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:08.153274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:08.652859 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:08.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:08.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:09.153667 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:09.153756 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:09.154076 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:09.652824 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:09.652899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:09.653232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:10.153341 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:10.153410 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:10.153757 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:10.153818 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:10.653710 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:10.653802 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:10.654164 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:11.152777 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:11.152862 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:11.153199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:11.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:11.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:11.653219 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:12.152831 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:12.152908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:12.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:12.652832 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:12.652911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:12.653226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:12.653273 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:13.152877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:13.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:13.153279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:13.652822 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:13.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:13.653241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:14.152814 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:14.152897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:14.153218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:14.653177 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:14.653250 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:14.653558 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:14.653611 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:15.153356 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:15.153436 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:15.153788 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:15.652725 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:15.652816 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:15.653161 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:16.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:16.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:16.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:16.215537 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:07:16.273841 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:16.273881 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:16.273899 2059048 retry.go:31] will retry after 30.872906877s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:16.653514 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:16.653598 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:16.653919 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:16.653970 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:17.153579 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:17.153656 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:17.153994 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:17.653597 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:17.653665 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:17.653945 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:18.152782 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:18.152859 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:18.153155 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:18.652836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:18.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:18.653269 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:19.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:19.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:19.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:19.153292 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:19.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:19.652916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:19.653250 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:20.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:20.152910 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:20.153258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:20.653266 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:20.653354 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:20.653711 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:21.153511 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:21.153585 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:21.153886 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:21.153948 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:21.653690 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:21.653776 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:21.654081 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:22.153312 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:22.153387 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:22.153749 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:22.653581 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:22.653661 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:22.654117 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:23.152715 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:23.152802 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:23.153141 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:23.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:23.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:23.653196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:23.653241 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:24.152848 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:24.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:24.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:24.653127 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:24.653203 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:24.653560 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:25.153321 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:25.153393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:25.153662 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:25.652744 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:25.652855 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:25.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:25.653298 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:26.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:26.153049 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:26.153397 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:26.652880 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:26.652963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:26.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:27.152811 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:27.152888 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:27.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:27.652936 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:27.653013 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:27.653346 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:27.653407 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:28.152886 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:28.152961 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:28.153298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:28.652829 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:28.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:28.653240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:29.152818 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:29.152890 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:29.153229 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:29.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:29.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:29.653200 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:30.152832 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:30.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:30.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:30.153321 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:30.652996 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:30.653069 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:30.653387 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:31.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:31.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:31.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:31.295743 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:07:31.365905 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:31.365953 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:31.366070 2059048 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1219 06:07:31.653338 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:31.653413 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:31.653757 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:32.153415 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:32.153519 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:32.153862 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:32.153934 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:32.653181 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:32.653249 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:32.653512 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:33.152815 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:33.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:33.153193 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:33.652817 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:33.652892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:33.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:34.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:34.152941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:34.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:34.653155 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:34.653231 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:34.653574 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:34.653631 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:35.153386 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:35.153461 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:35.153800 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:35.652767 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:35.652837 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:35.653104 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:36.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:36.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:36.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:36.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:36.652927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:36.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:37.152893 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:37.152978 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:37.153238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:37.153278 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:37.652927 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:37.653002 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:37.653295 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:38.152985 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:38.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:38.153404 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:38.652889 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:38.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:38.653220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:39.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:39.152920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:39.153233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:39.153294 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:39.652812 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:39.652889 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:39.653215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:40.152857 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:40.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:40.153187 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:40.653073 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:40.653148 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:40.653479 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:41.152804 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:41.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:41.153180 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:41.652771 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:41.652841 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:41.653154 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:41.653206 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:42.152884 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:42.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:42.153327 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:42.652853 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:42.652951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:42.653271 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:43.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:43.152931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:43.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:43.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:43.652918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:43.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:43.653314 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:44.152994 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:44.153073 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:44.153402 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:44.653413 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:44.653502 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:44.653799 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:45.153668 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:45.153801 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:45.154199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:45.653080 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:45.653158 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:45.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:45.653538 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:46.153253 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:46.153372 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:46.153620 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:46.653410 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:46.653505 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:46.653901 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:47.147624 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:07:47.153238 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:47.153313 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:47.153618 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:47.207245 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:47.207289 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:47.207381 2059048 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1219 06:07:47.212201 2059048 out.go:179] * Enabled addons: 
	I1219 06:07:47.215092 2059048 addons.go:546] duration metric: took 1m52.553895373s for enable addons: enabled=[]
	I1219 06:07:47.652840 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:47.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:47.653177 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:48.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:48.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:48.153274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:48.153336 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:48.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:48.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:48.653266 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:49.152877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:49.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:49.153222 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:49.652844 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:49.652940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:49.653312 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:50.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:50.152906 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:50.153448 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:50.153518 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:50.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:50.653101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:50.653362 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:51.153113 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:51.153194 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:51.153608 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:51.653414 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:51.653487 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:51.653829 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:52.153249 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:52.153337 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:52.153602 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:52.153645 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:52.653349 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:52.653422 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:52.653735 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:53.153542 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:53.153620 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:53.153960 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:53.653712 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:53.653793 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:53.654088 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:54.153153 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:54.153246 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:54.153650 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:54.153714 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:54.653597 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:54.653675 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:54.654059 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:55.153390 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:55.153470 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:55.153780 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:55.652892 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:55.652968 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:55.653343 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:56.153064 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:56.153144 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:56.153504 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:56.652934 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:56.653001 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:56.653305 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:56.653374 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:57.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:57.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:57.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:57.652979 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:57.653054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:57.653394 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:58.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:58.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:58.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:58.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:58.652916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:58.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:59.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:59.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:59.153252 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:59.153305 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:59.652871 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:59.652957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:59.653221 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:00.152926 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:00.153011 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:00.153341 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:00.653583 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:00.653664 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:00.654050 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:01.153430 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:01.153505 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:01.153842 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:01.153899 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:01.653658 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:01.653734 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:01.654077 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:02.152810 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:02.152894 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:02.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:02.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:02.652992 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:02.653255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:03.152814 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:03.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:03.153241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:03.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:03.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:03.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:03.653372 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:04.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:04.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:04.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:04.653311 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:04.653393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:04.653691 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:05.153373 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:05.153449 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:05.153786 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:05.653505 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:05.653577 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:05.653866 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:05.653911 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:06.153684 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:06.153763 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:06.154116 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:06.652849 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:06.652928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:06.653265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:07.152801 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:07.152875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:07.153140 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:07.652821 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:07.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:07.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:08.152989 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:08.153069 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:08.153365 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:08.153412 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:08.652920 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:08.652987 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:08.653255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:09.152797 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:09.152875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:09.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:09.652928 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:09.653026 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:09.653367 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:10.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:10.152928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:10.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:10.653055 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:10.653153 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:10.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:10.653536 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:11.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:11.152945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:11.153256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:11.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:11.652952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:11.653279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:12.152810 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:12.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:12.153241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:12.652960 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:12.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:12.653342 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:13.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:13.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:13.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:13.153250 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:13.652804 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:13.652895 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:13.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:14.152963 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:14.153048 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:14.153407 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:14.653382 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:14.653457 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:14.653810 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:15.153570 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:15.153650 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:15.153993 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:15.154055 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:15.652797 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:15.652875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:15.653205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:16.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:16.152927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:16.153181 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:16.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:16.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:16.653237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:17.152851 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:17.152930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:17.153255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:17.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:17.652952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:17.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:17.653327 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:18.152822 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:18.152897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:18.153211 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:18.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:18.652929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:18.653226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:19.152862 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:19.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:19.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:19.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:19.652943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:19.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:20.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:20.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:20.153353 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:20.153402 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:20.652914 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:20.652990 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:20.653265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:21.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:21.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:21.153199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:21.652903 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:21.652980 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:21.653286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:22.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:22.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:22.153212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:22.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:22.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:22.653257 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:22.653311 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:23.152987 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:23.153082 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:23.153450 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:23.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:23.652933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:23.653264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:24.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:24.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:24.153259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:24.653188 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:24.653259 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:24.653621 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:24.653676 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:25.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:25.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:25.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:25.653096 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:25.653176 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:25.653514 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:26.153303 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:26.153380 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:26.153718 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:26.653504 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:26.653583 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:26.653866 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:26.653917 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:27.153641 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:27.153723 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:27.154070 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:27.653768 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:27.653851 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:27.654198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:28.152880 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:28.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:28.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:28.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:28.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:28.653286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:29.152996 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:29.153076 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:29.153423 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:29.153485 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:29.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:29.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:29.653208 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:30.152848 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:30.152931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:30.153247 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:30.653099 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:30.653178 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:30.653543 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:31.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:31.152933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:31.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:31.652836 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:31.652916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:31.653254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:31.653310 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:32.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:32.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:32.153234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:32.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:32.652974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:32.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:33.152813 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:33.152892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:33.153182 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:33.652873 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:33.652952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:33.653279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:33.653339 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:34.152888 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:34.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:34.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:34.653221 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:34.653303 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:34.653662 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:35.153491 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:35.153567 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:35.153923 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:35.653686 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:35.653756 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:35.654034 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:35.654075 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:36.152742 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:36.152852 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:36.153178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:36.652917 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:36.652991 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:36.653328 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:37.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:37.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:37.153269 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:37.652832 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:37.652905 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:37.653225 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:38.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:38.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:38.153256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:38.153311 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:38.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:38.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:38.653254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:39.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:39.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:39.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:39.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:39.652948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:39.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:40.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:40.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:40.153201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:40.653085 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:40.653160 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:40.653488 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:40.653543 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:41.152787 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:41.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:41.153181 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:41.652770 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:41.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:41.653122 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:42.152887 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:42.152974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:42.153376 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:42.653103 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:42.653188 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:42.653511 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:42.653570 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:43.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:43.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:43.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:43.652856 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:43.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:43.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:44.153027 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:44.153105 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:44.153433 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:44.653459 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:44.653530 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:44.653788 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:44.653840 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:45.153678 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:45.153766 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:45.156105 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=2
	I1219 06:08:45.653114 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:45.653196 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:45.653533 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:46.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:46.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:46.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:46.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:46.652950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:46.653258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:47.152995 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:47.153106 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:47.153459 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:47.153515 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:47.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:47.652955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:47.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:48.152852 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:48.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:48.153282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:48.652874 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:48.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:48.653318 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:49.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:49.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:49.153202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:49.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:49.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:49.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:49.653282 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:50.152954 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:50.153027 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:50.153317 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:50.653024 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:50.653102 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:50.653365 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:51.152850 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:51.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:51.153219 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:51.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:51.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:51.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:51.653343 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:52.152878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:52.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:52.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:52.652974 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:52.653059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:52.653395 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:53.153096 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:53.153174 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:53.153508 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:53.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:53.652951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:53.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:54.152839 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:54.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:54.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:54.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:54.653199 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:54.653273 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:54.653604 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:55.153420 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:55.153510 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:55.153789 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:55.652811 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:55.652903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:55.653294 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:56.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:56.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:56.153223 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:56.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:56.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:56.653259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:56.653316 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:57.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:57.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:57.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:57.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:57.652981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:57.653323 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:58.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:58.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:58.153209 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:58.652840 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:58.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:58.653217 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:59.152905 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:59.152981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:59.153315 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:59.153381 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:59.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:59.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:59.653183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:00.152906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:00.153005 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:00.153357 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:00.653205 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:00.653291 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:00.653625 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:01.153283 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:01.153360 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:01.153628 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:01.153671 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:01.653416 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:01.653497 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:01.653884 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:02.153557 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:02.153633 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:02.154010 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:02.652736 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:02.652830 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:02.653106 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:03.152802 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:03.152877 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:03.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:03.652921 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:03.652999 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:03.653309 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:03.653358 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:04.152885 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:04.152955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:04.153320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:04.653344 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:04.653416 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:04.653746 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:05.153560 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:05.153640 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:05.153974 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:05.652768 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:05.652867 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:05.653179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:06.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:06.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:06.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:06.153272 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:06.652921 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:06.653000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:06.653306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:07.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:07.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:07.153227 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:07.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:07.652898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:07.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:08.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:08.152914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:08.153262 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:08.153318 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:08.652911 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:08.652979 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:08.653282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:09.152916 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:09.152986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:09.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:09.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:09.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:09.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:10.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:10.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:10.153226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:10.653112 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:10.653192 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:10.653511 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:10.653577 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:11.153350 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:11.153429 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:11.153777 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:11.653085 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:11.653162 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:11.653429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:12.152806 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:12.152904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:12.153213 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:12.652905 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:12.652980 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:12.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:13.152912 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:13.152981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:13.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:13.153367 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:13.653051 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:13.653127 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:13.653415 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:14.152821 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:14.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:14.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:14.653028 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:14.653106 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:14.653360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:15.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:15.152912 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:15.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:15.653006 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:15.653081 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:15.653415 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:15.653467 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:16.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:16.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:16.153234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:16.652830 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:16.652908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:16.653204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:17.152901 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:17.152974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:17.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:17.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:17.652923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:17.653180 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:18.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:18.152893 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:18.153221 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:18.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:18.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:18.652988 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:18.653283 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:19.152887 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:19.152961 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:19.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:19.652913 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:19.652992 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:19.653321 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:20.153018 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:20.153092 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:20.153437 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:20.153501 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:20.653011 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:20.653093 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:20.653372 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:21.153069 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:21.153153 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:21.153484 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:21.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:21.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:21.653322 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:22.153458 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:22.153530 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:22.153790 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:22.153833 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:22.653650 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:22.653724 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:22.654057 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:23.152779 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:23.152853 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:23.153175 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:23.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:23.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:23.653280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:24.152829 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:24.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:24.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:24.653129 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:24.653203 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:24.653539 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:24.653595 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:25.153181 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:25.153256 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:25.153525 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:25.653492 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:25.653572 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:25.653896 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:26.153709 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:26.153785 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:26.154149 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:26.653439 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:26.653511 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:26.653845 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:26.653912 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:27.153641 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:27.153711 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:27.154059 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:27.653737 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:27.653813 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:27.654171 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:28.152737 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:28.152824 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:28.153136 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:28.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:28.652929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:28.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:29.152816 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:29.152890 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:29.153304 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:29.153363 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:29.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:29.652949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:29.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:30.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:30.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:30.153286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:30.653142 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:30.653226 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:30.653576 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:31.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:31.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:31.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:31.652853 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:31.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:31.653285 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:31.653341 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:32.153002 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:32.153077 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:32.153389 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:32.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:32.652928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:32.653191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:33.152906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:33.152985 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:33.153320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:33.653030 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:33.653112 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:33.653463 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:33.653523 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:34.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:34.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:34.153191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:34.653266 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:34.653343 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:34.653688 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:35.153480 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:35.153562 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:35.153920 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:35.653700 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:35.653779 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:35.654078 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:35.654124 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:36.152813 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:36.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:36.153244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:36.652824 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:36.652902 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:36.653244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:37.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:37.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:37.153200 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:37.652845 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:37.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:37.653218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:38.152831 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:38.152912 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:38.153208 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:38.153253 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:38.652887 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:38.652966 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:38.653228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:39.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:39.152913 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:39.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:39.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:39.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:39.653299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:40.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:40.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:40.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:40.153287 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:40.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:40.653107 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:40.653447 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:41.152920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:41.153249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:41.652882 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:41.652956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:41.653222 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:42.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:42.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:42.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:42.153350 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:42.653039 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:42.653114 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:42.653443 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:43.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:43.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:43.153196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:43.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:43.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:43.653298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:44.153021 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:44.153098 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:44.153446 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:44.153502 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:44.653394 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:44.653463 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:44.653758 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:45.153716 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:45.153844 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:45.154316 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:45.653443 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:45.653522 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:45.653863 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:46.153623 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:46.153701 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:46.153971 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:46.154014 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:46.653765 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:46.653843 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:46.654187 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:47.152841 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:47.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:47.153244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:47.652861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:47.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:47.653190 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:48.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:48.152954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:48.153355 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:48.653077 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:48.653151 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:48.653475 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:48.653535 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:49.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:49.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:49.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:49.652829 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:49.652905 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:49.653255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:50.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:50.153052 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:50.153380 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:50.653140 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:50.653211 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:50.653679 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:50.653731 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:51.153473 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:51.153550 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:51.154738 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=1
	I1219 06:09:51.653546 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:51.653618 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:51.653958 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:52.153274 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:52.153349 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:52.153606 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:52.653351 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:52.653426 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:52.653752 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:52.653808 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:53.153430 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:53.153501 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:53.153810 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:53.653040 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:53.653137 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:53.653483 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:54.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:54.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:54.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:54.652950 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:54.653032 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:54.653335 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:55.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:55.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:55.153273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:55.153315 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:55.653562 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:55.653634 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:55.653988 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:56.152721 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:56.152824 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:56.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:56.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:56.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:56.653204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:57.152895 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:57.152971 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:57.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:57.153359 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:57.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:57.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:57.653235 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:58.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:58.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:58.153268 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:58.652975 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:58.653058 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:58.653396 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:59.153105 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:59.153184 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:59.153474 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:59.153520 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:59.652864 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:59.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:59.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:00.152931 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:00.153036 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:00.153356 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:00.653257 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:00.653334 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:00.653658 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:01.153453 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:01.153528 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:01.153794 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:01.153845 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:01.653619 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:01.653697 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:01.654088 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:02.153731 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:02.153810 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:02.154155 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:02.652784 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:02.652868 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:02.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:03.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:03.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:03.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:03.652980 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:03.653056 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:03.653404 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:03.653465 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:04.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:04.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:04.153292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:04.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:04.653267 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:04.653580 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:05.152837 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:05.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:05.153255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:05.652984 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:05.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:05.653348 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:06.153037 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:06.153117 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:06.153467 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:06.153522 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:06.653186 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:06.653261 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:06.653599 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:07.152860 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:07.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:07.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:07.652828 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:07.652907 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:07.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:08.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:08.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:08.153282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:08.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:08.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:08.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:08.653242 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:09.152822 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:09.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:09.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:09.652859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:09.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:09.653294 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:10.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:10.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:10.153225 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:10.653128 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:10.653226 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:10.653575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:10.653636 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:11.153427 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:11.153513 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:11.153854 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:11.653325 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:11.653393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:11.653695 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:12.153493 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:12.153567 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:12.153867 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:12.653672 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:12.653754 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:12.654079 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:12.654129 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:13.152786 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:13.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:13.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:13.652847 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:13.652923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:13.653234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:14.152794 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:14.152866 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:14.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:14.653147 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:14.653224 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:14.653478 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:15.152818 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:15.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:15.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:15.153301 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:15.653095 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:15.653176 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:15.653536 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:16.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:16.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:16.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:16.652894 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:16.652982 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:16.653351 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:17.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:17.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:17.153312 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:17.153367 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:17.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:17.652943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:17.653235 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:18.152825 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:18.152903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:18.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:18.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:18.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:18.653272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:19.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:19.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:19.153179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:19.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:19.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:19.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:19.653346 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:20.152854 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:20.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:20.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:20.653045 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:20.653125 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:20.653445 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:21.153150 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:21.153227 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:21.153575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:21.652847 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:21.652931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:21.653260 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:22.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:22.152940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:22.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:22.153260 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:22.652895 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:22.652974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:22.653308 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:23.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:23.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:23.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:23.653479 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:23.653551 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:23.653818 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:24.153710 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:24.153800 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:24.154142 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:24.154201 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:24.653230 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:24.653310 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:24.653643 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:25.153415 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:25.153494 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:25.153825 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:25.652780 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:25.652869 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:25.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:26.152955 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:26.153029 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:26.153332 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:26.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:26.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:26.653203 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:26.653244 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:27.152819 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:27.152893 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:27.153237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:27.652953 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:27.653040 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:27.653394 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:28.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:28.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:28.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:28.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:28.652921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:28.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:28.653296 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:29.153007 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:29.153109 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:29.153490 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:29.652904 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:29.653002 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:29.653393 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:30.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:30.152941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:30.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:30.653036 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:30.653110 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:30.653469 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:30.653528 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:31.152866 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:31.152940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:31.153201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:31.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:31.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:31.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:32.152981 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:32.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:32.153421 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:32.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:32.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:32.653184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:33.152815 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:33.152902 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:33.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:33.153256 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:33.652858 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:33.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:33.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:34.153699 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:34.153778 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:34.154156 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:34.652927 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:34.653004 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:34.653344 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:35.152944 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:35.153053 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:35.153407 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:35.153473 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:35.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:35.653108 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:35.653439 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:36.152985 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:36.153058 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:36.153410 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:36.652997 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:36.653074 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:36.653385 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:37.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:37.152927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:37.153178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:37.652820 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:37.652898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:37.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:37.653293 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:38.152835 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:38.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:38.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:38.652893 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:38.652984 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:38.653356 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:39.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:39.152909 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:39.153250 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:39.652830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:39.652914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:39.653251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:40.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:40.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:40.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:40.153267 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:40.653044 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:40.653127 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:40.653472 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:41.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:41.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:41.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:41.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:41.653202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:42.152888 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:42.152978 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:42.153360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:42.153425 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:42.653108 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:42.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:42.653535 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:43.153293 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:43.153377 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:43.153699 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:43.653511 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:43.653596 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:43.653946 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:44.153626 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:44.153701 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:44.154058 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:44.154116 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:44.653518 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:44.653586 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:44.653839 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:45.153714 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:45.153803 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:45.154242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:45.653103 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:45.653182 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:45.653567 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:46.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:46.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:46.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:46.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:46.652901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:46.653220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:46.653276 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:47.153000 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:47.153090 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:47.153484 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:47.652874 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:47.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:47.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:48.152829 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:48.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:48.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:48.653001 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:48.653085 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:48.653425 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:48.653480 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:49.152866 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:49.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:49.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:49.652909 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:49.652997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:49.653352 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:50.152932 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:50.153010 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:50.153347 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:50.653023 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:50.653100 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:50.653383 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:51.153061 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:51.153137 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:51.153452 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:51.153512 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:51.653210 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:51.653296 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:51.653657 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:52.153489 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:52.153557 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:52.153876 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:52.653692 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:52.653768 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:52.654090 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:53.152787 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:53.152864 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:53.153157 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:53.652784 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:53.652862 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:53.653125 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:53.653167 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:54.152851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:54.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:54.153283 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:54.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:54.653265 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:54.653642 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:55.153555 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:55.153638 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:55.153984 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:55.652996 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:55.653078 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:55.653399 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:55.653461 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:56.153144 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:56.153229 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:56.153575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:56.653124 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:56.653197 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:56.653482 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:57.152912 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:57.152988 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:57.153306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:57.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:57.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:57.653287 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:58.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:58.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:58.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:58.153273 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:58.652822 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:58.652903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:58.653236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:59.152792 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:59.152869 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:59.153185 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:59.652732 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:59.652827 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:59.653140 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:00.152931 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:00.153022 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:00.153339 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:00.153391 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:00.653248 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:00.653323 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:00.653669 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:01.153449 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:01.153521 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:01.153868 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:01.653237 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:01.653336 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:01.653684 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:02.153489 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:02.153575 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:02.153909 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:02.153978 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:02.653742 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:02.653822 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:02.654154 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:03.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:03.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:03.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:03.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:03.652939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:03.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:04.152862 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:04.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:04.153190 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:04.653274 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:04.653353 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:04.653687 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:04.653753 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:05.153511 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:05.153585 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:05.153947 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:05.652710 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:05.652796 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:05.653061 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:06.152795 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:06.152871 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:06.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:06.652955 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:06.653039 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:06.653379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:07.152874 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:07.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:07.153220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:07.153260 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:07.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:07.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:07.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:08.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:08.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:08.153280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:08.652915 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:08.652997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:08.653258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:09.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:09.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:09.153313 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:09.153369 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:09.653051 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:09.653130 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:09.653466 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:10.152899 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:10.152977 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:10.153296 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:10.653043 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:10.653165 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:10.653488 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:11.152868 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:11.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:11.153273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:11.653413 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:11.653483 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:11.653817 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:11.653864 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:12.153612 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:12.153686 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:12.154026 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:12.652742 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:12.652850 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:12.653128 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:13.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:13.152886 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:13.153131 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:13.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:13.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:13.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:14.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:14.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:14.153275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:14.153331 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:14.653226 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:14.653309 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:14.653648 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:15.153413 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:15.153488 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:15.153804 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:15.652744 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:15.652844 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:15.653142 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:16.152784 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:16.152853 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:16.153159 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:16.652816 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:16.652891 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:16.653186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:16.653233 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:17.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:17.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:17.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:17.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:17.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:17.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:18.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:18.152914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:18.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:18.652945 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:18.653025 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:18.653318 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:18.653380 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:19.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:19.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:19.153231 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:19.652844 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:19.652920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:19.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:20.152860 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:20.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:20.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:20.653014 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:20.653092 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:20.653351 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:21.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:21.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:21.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:21.153341 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:21.653010 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:21.653087 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:21.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:22.152850 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:22.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:22.153186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:22.652828 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:22.652906 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:22.653231 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:23.152837 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:23.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:23.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:23.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:23.652942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:23.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:23.653241 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:24.152900 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:24.152984 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:24.153313 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:24.653189 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:24.653270 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:24.653611 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:25.152919 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:25.152989 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:25.153298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:25.653028 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:25.653101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:25.653438 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:25.653492 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:26.153176 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:26.153256 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:26.153570 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:26.652912 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:26.652986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:26.653378 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:27.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:27.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:27.153259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:27.652839 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:27.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:27.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:28.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:28.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:28.153233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:28.153276 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:28.652833 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:28.652907 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:28.653249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:29.152989 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:29.153064 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:29.153396 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:29.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:29.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:29.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:30.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:30.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:30.153258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:30.153317 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:30.653045 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:30.653118 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:30.653429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:31.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:31.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:31.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:31.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:31.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:31.653292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:32.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:32.152919 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:32.153241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:32.652864 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:32.652940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:32.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:32.653263 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:33.152791 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:33.152875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:33.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:33.652896 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:33.652977 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:33.653320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:34.152876 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:34.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:34.153289 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:34.653352 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:34.653430 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:34.653807 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:34.653869 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:35.153637 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:35.153718 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:35.154044 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:35.652946 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:35.653021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:35.653340 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:36.152965 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:36.153049 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:36.153384 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:36.653133 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:36.653213 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:36.653535 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:37.153283 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:37.153355 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:37.153669 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:37.153731 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:37.653516 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:37.653597 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:37.653938 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:38.153755 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:38.153833 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:38.154248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:38.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:38.652956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:38.653272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:39.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:39.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:39.153296 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:39.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:39.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:39.653289 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:39.653347 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:40.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:40.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:40.153212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:40.653026 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:40.653107 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:40.653447 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:41.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:41.153249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:41.652949 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:41.653030 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:41.653297 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:42.152907 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:42.153021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:42.153435 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:42.153505 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:42.653182 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:42.653258 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:42.653594 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:43.152852 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:43.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:43.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:43.652858 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:43.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:43.653276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:44.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:44.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:44.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:44.653236 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:44.653312 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:44.653574 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:44.653614 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:45.153508 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:45.153630 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:45.154114 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:45.653037 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:45.653120 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:45.653478 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:46.152854 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:46.152928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:46.153280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:46.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:46.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:46.653286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:47.152995 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:47.153070 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:47.153369 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:47.153415 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:47.652884 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:47.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:47.653305 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:48.152817 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:48.152895 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:48.153237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:48.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:48.652925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:48.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:49.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:49.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:49.153213 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:49.652807 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:49.652884 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:49.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:49.653285 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:50.152972 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:50.153050 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:50.153344 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:50.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:50.653264 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:50.653522 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:51.153213 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:51.153287 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:51.153583 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:51.653360 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:51.653435 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:51.653779 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:51.653840 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:52.153070 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:52.153143 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:52.153403 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:52.653101 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:52.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:52.653483 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:53.152800 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:53.152878 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:53.153196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:53.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:53.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:53.653274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:54.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:54.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:54.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:54.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:54.653291 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:54.653363 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:54.653706 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:55.152998 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:55.153089 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:55.153429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:55.653064 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:55.653125 2059048 node_ready.go:38] duration metric: took 6m0.000540604s for node "functional-006924" to be "Ready" ...
	I1219 06:11:55.656290 2059048 out.go:203] 
	W1219 06:11:55.659114 2059048 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1219 06:11:55.659135 2059048 out.go:285] * 
	W1219 06:11:55.661307 2059048 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1219 06:11:55.664349 2059048 out.go:203] 

** /stderr **
functional_test.go:676: failed to soft start minikube. args "out/minikube-linux-arm64 start -p functional-006924 --alsologtostderr -v=8": exit status 80
functional_test.go:678: soft start took 6m5.683033805s for "functional-006924" cluster.
I1219 06:11:56.175832 2000386 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-006924
helpers_test.go:244: (dbg) docker inspect functional-006924:

-- stdout --
	[
	    {
	        "Id": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	        "Created": "2025-12-19T05:57:32.987616309Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2053574,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T05:57:33.050252475Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hostname",
	        "HostsPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hosts",
	        "LogPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6-json.log",
	        "Name": "/functional-006924",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-006924:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-006924",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	                "LowerDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/merged",
	                "UpperDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/diff",
	                "WorkDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-006924",
	                "Source": "/var/lib/docker/volumes/functional-006924/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-006924",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-006924",
	                "name.minikube.sigs.k8s.io": "functional-006924",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "c06ab2bd44169716d410789ed39ed6e7c04e20cbf7fddb96691439282b9c97ca",
	            "SandboxKey": "/var/run/docker/netns/c06ab2bd4416",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34704"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34705"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34708"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34706"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34707"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-006924": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "42:2f:87:6a:a8:7b",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "f63e8dc2cff83663f8a4d14108f192e61e457410fa4fc720cd9630dbf354815d",
	                    "EndpointID": "aa2b1cbd90d5c1f6130481423d97f82d974d4197e41ad0dbe3b7e51b22c8b4cc",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-006924",
	                        "651d0d6ef1db"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924: exit status 2 (337.243568ms)

-- stdout --
	Running

                                                
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-006924 logs -n 25: (1.005377999s)
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                         ARGS                                                                          │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-125117 image rm kicbase/echo-server:functional-125117 --alsologtostderr                                                                    │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                             │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image save --daemon kicbase/echo-server:functional-125117 --alsologtostderr                                                         │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/2000386.pem                                                                                             │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /usr/share/ca-certificates/2000386.pem                                                                                 │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                              │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/20003862.pem                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /usr/share/ca-certificates/20003862.pem                                                                                │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                              │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/test/nested/copy/2000386/hosts                                                                                    │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format short --alsologtostderr                                                                                           │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format yaml --alsologtostderr                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh pgrep buildkitd                                                                                                                 │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │                     │
	│ image          │ functional-125117 image build -t localhost/my-image:functional-125117 testdata/build --alsologtostderr                                                │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format json --alsologtostderr                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format table --alsologtostderr                                                                                           │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ delete         │ -p functional-125117                                                                                                                                  │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ start          │ -p functional-006924 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │                     │
	│ start          │ -p functional-006924 --alsologtostderr -v=8                                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:05 UTC │                     │
	└────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 06:05:50
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 06:05:50.537990 2059048 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:05:50.538849 2059048 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:05:50.538894 2059048 out.go:374] Setting ErrFile to fd 2...
	I1219 06:05:50.538913 2059048 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:05:50.539188 2059048 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:05:50.539610 2059048 out.go:368] Setting JSON to false
	I1219 06:05:50.540502 2059048 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":38897,"bootTime":1766085454,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:05:50.540601 2059048 start.go:143] virtualization:  
	I1219 06:05:50.544140 2059048 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:05:50.547152 2059048 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:05:50.547218 2059048 notify.go:221] Checking for updates...
	I1219 06:05:50.550931 2059048 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:05:50.553869 2059048 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:50.556730 2059048 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:05:50.559634 2059048 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:05:50.562450 2059048 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:05:50.565702 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:50.565828 2059048 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:05:50.590709 2059048 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:05:50.590846 2059048 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:05:50.653898 2059048 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:05:50.644590744 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:05:50.654020 2059048 docker.go:319] overlay module found
	I1219 06:05:50.657204 2059048 out.go:179] * Using the docker driver based on existing profile
	I1219 06:05:50.660197 2059048 start.go:309] selected driver: docker
	I1219 06:05:50.660214 2059048 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:50.660310 2059048 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:05:50.660408 2059048 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:05:50.713439 2059048 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:05:50.704333478 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:05:50.713872 2059048 cni.go:84] Creating CNI manager for ""
	I1219 06:05:50.713935 2059048 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:05:50.713992 2059048 start.go:353] cluster config:
	{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:50.717210 2059048 out.go:179] * Starting "functional-006924" primary control-plane node in "functional-006924" cluster
	I1219 06:05:50.719980 2059048 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1219 06:05:50.722935 2059048 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1219 06:05:50.726070 2059048 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:05:50.726124 2059048 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1219 06:05:50.726135 2059048 cache.go:65] Caching tarball of preloaded images
	I1219 06:05:50.726179 2059048 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1219 06:05:50.726225 2059048 preload.go:238] Found /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1219 06:05:50.726236 2059048 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1219 06:05:50.726339 2059048 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/config.json ...
	I1219 06:05:50.745888 2059048 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1219 06:05:50.745915 2059048 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1219 06:05:50.745932 2059048 cache.go:243] Successfully downloaded all kic artifacts
	I1219 06:05:50.745963 2059048 start.go:360] acquireMachinesLock for functional-006924: {Name:mkc84f48e83d18024791d45db780f3ccd746613a Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1219 06:05:50.746023 2059048 start.go:364] duration metric: took 37.752µs to acquireMachinesLock for "functional-006924"
	I1219 06:05:50.746049 2059048 start.go:96] Skipping create...Using existing machine configuration
	I1219 06:05:50.746059 2059048 fix.go:54] fixHost starting: 
	I1219 06:05:50.746334 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:50.762745 2059048 fix.go:112] recreateIfNeeded on functional-006924: state=Running err=<nil>
	W1219 06:05:50.762777 2059048 fix.go:138] unexpected machine state, will restart: <nil>
	I1219 06:05:50.765990 2059048 out.go:252] * Updating the running docker "functional-006924" container ...
	I1219 06:05:50.766020 2059048 machine.go:94] provisionDockerMachine start ...
	I1219 06:05:50.766101 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:50.782668 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:50.783000 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:50.783017 2059048 main.go:144] libmachine: About to run SSH command:
	hostname
	I1219 06:05:50.940618 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:05:50.940641 2059048 ubuntu.go:182] provisioning hostname "functional-006924"
	I1219 06:05:50.940708 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:50.964854 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:50.965181 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:50.965199 2059048 main.go:144] libmachine: About to run SSH command:
	sudo hostname functional-006924 && echo "functional-006924" | sudo tee /etc/hostname
	I1219 06:05:51.129720 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:05:51.129816 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.147357 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:51.147663 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:51.147686 2059048 main.go:144] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-006924' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-006924/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-006924' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1219 06:05:51.301923 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: 
	I1219 06:05:51.301949 2059048 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22230-1998525/.minikube CaCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22230-1998525/.minikube}
	I1219 06:05:51.301977 2059048 ubuntu.go:190] setting up certificates
	I1219 06:05:51.301985 2059048 provision.go:84] configureAuth start
	I1219 06:05:51.302047 2059048 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:05:51.323653 2059048 provision.go:143] copyHostCerts
	I1219 06:05:51.323700 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:05:51.323742 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem, removing ...
	I1219 06:05:51.323756 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:05:51.323832 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem (1078 bytes)
	I1219 06:05:51.323915 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:05:51.323932 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem, removing ...
	I1219 06:05:51.323937 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:05:51.323964 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem (1123 bytes)
	I1219 06:05:51.324003 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:05:51.324018 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem, removing ...
	I1219 06:05:51.324022 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:05:51.324044 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem (1671 bytes)
	I1219 06:05:51.324090 2059048 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem org=jenkins.functional-006924 san=[127.0.0.1 192.168.49.2 functional-006924 localhost minikube]
	I1219 06:05:51.441821 2059048 provision.go:177] copyRemoteCerts
	I1219 06:05:51.441886 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1219 06:05:51.441926 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.459787 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.570296 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1219 06:05:51.570372 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1219 06:05:51.588363 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1219 06:05:51.588477 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1219 06:05:51.605684 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1219 06:05:51.605798 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1219 06:05:51.623473 2059048 provision.go:87] duration metric: took 321.473451ms to configureAuth
	I1219 06:05:51.623556 2059048 ubuntu.go:206] setting minikube options for container-runtime
	I1219 06:05:51.623741 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:51.623756 2059048 machine.go:97] duration metric: took 857.728961ms to provisionDockerMachine
	I1219 06:05:51.623765 2059048 start.go:293] postStartSetup for "functional-006924" (driver="docker")
	I1219 06:05:51.623788 2059048 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1219 06:05:51.623849 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1219 06:05:51.623892 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.641371 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.760842 2059048 ssh_runner.go:195] Run: cat /etc/os-release
	I1219 06:05:51.764225 2059048 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1219 06:05:51.764245 2059048 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1219 06:05:51.764250 2059048 command_runner.go:130] > VERSION_ID="12"
	I1219 06:05:51.764255 2059048 command_runner.go:130] > VERSION="12 (bookworm)"
	I1219 06:05:51.764259 2059048 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1219 06:05:51.764263 2059048 command_runner.go:130] > ID=debian
	I1219 06:05:51.764268 2059048 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1219 06:05:51.764273 2059048 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1219 06:05:51.764281 2059048 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1219 06:05:51.764323 2059048 main.go:144] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1219 06:05:51.764339 2059048 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1219 06:05:51.764350 2059048 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/addons for local assets ...
	I1219 06:05:51.764404 2059048 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/files for local assets ...
	I1219 06:05:51.764485 2059048 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> 20003862.pem in /etc/ssl/certs
	I1219 06:05:51.764491 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> /etc/ssl/certs/20003862.pem
	I1219 06:05:51.764572 2059048 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> hosts in /etc/test/nested/copy/2000386
	I1219 06:05:51.764576 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> /etc/test/nested/copy/2000386/hosts
	I1219 06:05:51.764619 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/2000386
	I1219 06:05:51.772196 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:05:51.790438 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts --> /etc/test/nested/copy/2000386/hosts (40 bytes)
	I1219 06:05:51.808099 2059048 start.go:296] duration metric: took 184.303334ms for postStartSetup
	I1219 06:05:51.808203 2059048 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:05:51.808277 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.825566 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.929610 2059048 command_runner.go:130] > 14%
	I1219 06:05:51.930200 2059048 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1219 06:05:51.934641 2059048 command_runner.go:130] > 169G
	I1219 06:05:51.935117 2059048 fix.go:56] duration metric: took 1.189053781s for fixHost
	I1219 06:05:51.935139 2059048 start.go:83] releasing machines lock for "functional-006924", held for 1.189101272s
	I1219 06:05:51.935225 2059048 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:05:51.954055 2059048 ssh_runner.go:195] Run: cat /version.json
	I1219 06:05:51.954105 2059048 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1219 06:05:51.954110 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.954164 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.979421 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.998216 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:52.088735 2059048 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1219 06:05:52.088901 2059048 ssh_runner.go:195] Run: systemctl --version
	I1219 06:05:52.184102 2059048 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1219 06:05:52.186843 2059048 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1219 06:05:52.186921 2059048 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1219 06:05:52.187021 2059048 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1219 06:05:52.191424 2059048 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1219 06:05:52.191590 2059048 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1219 06:05:52.191669 2059048 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1219 06:05:52.199647 2059048 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1219 06:05:52.199671 2059048 start.go:496] detecting cgroup driver to use...
	I1219 06:05:52.199702 2059048 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1219 06:05:52.199771 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1219 06:05:52.215530 2059048 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1219 06:05:52.228927 2059048 docker.go:218] disabling cri-docker service (if available) ...
	I1219 06:05:52.229039 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1219 06:05:52.245166 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1219 06:05:52.258582 2059048 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1219 06:05:52.378045 2059048 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1219 06:05:52.513092 2059048 docker.go:234] disabling docker service ...
	I1219 06:05:52.513180 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1219 06:05:52.528704 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1219 06:05:52.542109 2059048 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1219 06:05:52.652456 2059048 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1219 06:05:52.767269 2059048 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1219 06:05:52.781039 2059048 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1219 06:05:52.797281 2059048 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1219 06:05:52.797396 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1219 06:05:52.807020 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1219 06:05:52.816571 2059048 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1219 06:05:52.816661 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1219 06:05:52.826225 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:05:52.835109 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1219 06:05:52.843741 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:05:52.852504 2059048 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1219 06:05:52.860160 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1219 06:05:52.868883 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1219 06:05:52.877906 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1219 06:05:52.887403 2059048 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1219 06:05:52.894024 2059048 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1219 06:05:52.894921 2059048 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1219 06:05:52.902164 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:53.021703 2059048 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1219 06:05:53.168216 2059048 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1219 06:05:53.168331 2059048 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1219 06:05:53.171951 2059048 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1219 06:05:53.172022 2059048 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1219 06:05:53.172043 2059048 command_runner.go:130] > Device: 0,72	Inode: 1614        Links: 1
	I1219 06:05:53.172065 2059048 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1219 06:05:53.172084 2059048 command_runner.go:130] > Access: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172112 2059048 command_runner.go:130] > Modify: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172131 2059048 command_runner.go:130] > Change: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172148 2059048 command_runner.go:130] >  Birth: -
	I1219 06:05:53.172331 2059048 start.go:564] Will wait 60s for crictl version
	I1219 06:05:53.172432 2059048 ssh_runner.go:195] Run: which crictl
	I1219 06:05:53.175887 2059048 command_runner.go:130] > /usr/local/bin/crictl
	I1219 06:05:53.176199 2059048 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1219 06:05:53.203136 2059048 command_runner.go:130] > Version:  0.1.0
	I1219 06:05:53.203389 2059048 command_runner.go:130] > RuntimeName:  containerd
	I1219 06:05:53.203588 2059048 command_runner.go:130] > RuntimeVersion:  v2.2.0
	I1219 06:05:53.203784 2059048 command_runner.go:130] > RuntimeApiVersion:  v1
	I1219 06:05:53.207710 2059048 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1219 06:05:53.207845 2059048 ssh_runner.go:195] Run: containerd --version
	I1219 06:05:53.235328 2059048 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1219 06:05:53.237219 2059048 ssh_runner.go:195] Run: containerd --version
	I1219 06:05:53.254490 2059048 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1219 06:05:53.262101 2059048 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1219 06:05:53.264978 2059048 cli_runner.go:164] Run: docker network inspect functional-006924 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 06:05:53.280549 2059048 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1219 06:05:53.284647 2059048 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1219 06:05:53.284847 2059048 kubeadm.go:884] updating cluster {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1219 06:05:53.284979 2059048 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:05:53.285048 2059048 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:05:53.307306 2059048 command_runner.go:130] > {
	I1219 06:05:53.307331 2059048 command_runner.go:130] >   "images":  [
	I1219 06:05:53.307335 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307345 2059048 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1219 06:05:53.307350 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307356 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1219 06:05:53.307360 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307365 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307373 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1219 06:05:53.307380 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307385 2059048 command_runner.go:130] >       "size":  "40636774",
	I1219 06:05:53.307391 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307395 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307402 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307405 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307417 2059048 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1219 06:05:53.307425 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307431 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1219 06:05:53.307435 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307441 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307450 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1219 06:05:53.307455 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307460 2059048 command_runner.go:130] >       "size":  "8034419",
	I1219 06:05:53.307463 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307467 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307470 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307482 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307492 2059048 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1219 06:05:53.307496 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307501 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1219 06:05:53.307505 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307519 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307528 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1219 06:05:53.307534 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307538 2059048 command_runner.go:130] >       "size":  "21168808",
	I1219 06:05:53.307542 2059048 command_runner.go:130] >       "username":  "nonroot",
	I1219 06:05:53.307546 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307549 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307552 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307559 2059048 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1219 06:05:53.307565 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307570 2059048 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1219 06:05:53.307581 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307585 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307592 2059048 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1219 06:05:53.307598 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307602 2059048 command_runner.go:130] >       "size":  "21749640",
	I1219 06:05:53.307610 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307614 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307618 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307622 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307625 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307631 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307634 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307641 2059048 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1219 06:05:53.307647 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307653 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1219 06:05:53.307666 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307670 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307682 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1219 06:05:53.307689 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307693 2059048 command_runner.go:130] >       "size":  "24692223",
	I1219 06:05:53.307697 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307708 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307712 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307716 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307723 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307726 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307729 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307736 2059048 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1219 06:05:53.307742 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307748 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1219 06:05:53.307753 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307757 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307765 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1219 06:05:53.307769 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307773 2059048 command_runner.go:130] >       "size":  "20672157",
	I1219 06:05:53.307779 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307783 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307788 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307792 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307796 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307799 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307802 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307809 2059048 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1219 06:05:53.307813 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307821 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1219 06:05:53.307826 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307830 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307840 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1219 06:05:53.307845 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307849 2059048 command_runner.go:130] >       "size":  "22432301",
	I1219 06:05:53.307858 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307864 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307867 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307870 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307877 2059048 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1219 06:05:53.307884 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307889 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1219 06:05:53.307893 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307899 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307907 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1219 06:05:53.307913 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307917 2059048 command_runner.go:130] >       "size":  "15405535",
	I1219 06:05:53.307921 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307925 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307928 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307932 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307939 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307942 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307948 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307955 2059048 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1219 06:05:53.307963 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307967 2059048 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1219 06:05:53.307970 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307974 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307982 2059048 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1219 06:05:53.307987 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307991 2059048 command_runner.go:130] >       "size":  "267939",
	I1219 06:05:53.307996 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.308000 2059048 command_runner.go:130] >         "value":  "65535"
	I1219 06:05:53.308004 2059048 command_runner.go:130] >       },
	I1219 06:05:53.308011 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.308015 2059048 command_runner.go:130] >       "pinned":  true
	I1219 06:05:53.308020 2059048 command_runner.go:130] >     }
	I1219 06:05:53.308027 2059048 command_runner.go:130] >   ]
	I1219 06:05:53.308030 2059048 command_runner.go:130] > }
	I1219 06:05:53.310449 2059048 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:05:53.310472 2059048 containerd.go:534] Images already preloaded, skipping extraction
	I1219 06:05:53.310540 2059048 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:05:53.331271 2059048 command_runner.go:130] > {
	I1219 06:05:53.331288 2059048 command_runner.go:130] >   "images":  [
	I1219 06:05:53.331292 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331304 2059048 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1219 06:05:53.331309 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331314 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1219 06:05:53.331318 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331322 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331332 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1219 06:05:53.331336 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331340 2059048 command_runner.go:130] >       "size":  "40636774",
	I1219 06:05:53.331350 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331355 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331358 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331361 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331369 2059048 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1219 06:05:53.331373 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331378 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1219 06:05:53.331381 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331385 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331393 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1219 06:05:53.331396 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331400 2059048 command_runner.go:130] >       "size":  "8034419",
	I1219 06:05:53.331404 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331408 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331411 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331414 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331421 2059048 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1219 06:05:53.331425 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331430 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1219 06:05:53.331433 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331439 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331447 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1219 06:05:53.331451 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331454 2059048 command_runner.go:130] >       "size":  "21168808",
	I1219 06:05:53.331458 2059048 command_runner.go:130] >       "username":  "nonroot",
	I1219 06:05:53.331462 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331466 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331468 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331475 2059048 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1219 06:05:53.331479 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331484 2059048 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1219 06:05:53.331487 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331491 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331502 2059048 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1219 06:05:53.331506 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331510 2059048 command_runner.go:130] >       "size":  "21749640",
	I1219 06:05:53.331515 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331519 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331522 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331526 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331530 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331533 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331536 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331543 2059048 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1219 06:05:53.331547 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331551 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1219 06:05:53.331555 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331559 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331566 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1219 06:05:53.331569 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331573 2059048 command_runner.go:130] >       "size":  "24692223",
	I1219 06:05:53.331577 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331585 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331592 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331596 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331600 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331603 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331606 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331613 2059048 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1219 06:05:53.331617 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331622 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1219 06:05:53.331626 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331629 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331638 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1219 06:05:53.331641 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331645 2059048 command_runner.go:130] >       "size":  "20672157",
	I1219 06:05:53.331652 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331656 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331659 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331663 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331666 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331669 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331672 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331679 2059048 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1219 06:05:53.331683 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331688 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1219 06:05:53.331691 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331695 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331702 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1219 06:05:53.331705 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331709 2059048 command_runner.go:130] >       "size":  "22432301",
	I1219 06:05:53.331713 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331717 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331720 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331723 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331733 2059048 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1219 06:05:53.331737 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331742 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1219 06:05:53.331745 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331749 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331757 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1219 06:05:53.331760 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331764 2059048 command_runner.go:130] >       "size":  "15405535",
	I1219 06:05:53.331767 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331771 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331774 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331778 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331782 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331785 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331792 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331799 2059048 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1219 06:05:53.331803 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331807 2059048 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1219 06:05:53.331811 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331815 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331822 2059048 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1219 06:05:53.331826 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331829 2059048 command_runner.go:130] >       "size":  "267939",
	I1219 06:05:53.331833 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331837 2059048 command_runner.go:130] >         "value":  "65535"
	I1219 06:05:53.331841 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331845 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331849 2059048 command_runner.go:130] >       "pinned":  true
	I1219 06:05:53.331852 2059048 command_runner.go:130] >     }
	I1219 06:05:53.331855 2059048 command_runner.go:130] >   ]
	I1219 06:05:53.331858 2059048 command_runner.go:130] > }
	I1219 06:05:53.333541 2059048 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:05:53.333565 2059048 cache_images.go:86] Images are preloaded, skipping loading
	I1219 06:05:53.333574 2059048 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1219 06:05:53.333694 2059048 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-006924 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1219 06:05:53.333773 2059048 ssh_runner.go:195] Run: sudo crictl --timeout=10s info
	I1219 06:05:53.354890 2059048 command_runner.go:130] > {
	I1219 06:05:53.354909 2059048 command_runner.go:130] >   "cniconfig": {
	I1219 06:05:53.354915 2059048 command_runner.go:130] >     "Networks": [
	I1219 06:05:53.354919 2059048 command_runner.go:130] >       {
	I1219 06:05:53.354926 2059048 command_runner.go:130] >         "Config": {
	I1219 06:05:53.354932 2059048 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1219 06:05:53.354937 2059048 command_runner.go:130] >           "Name": "cni-loopback",
	I1219 06:05:53.354941 2059048 command_runner.go:130] >           "Plugins": [
	I1219 06:05:53.354945 2059048 command_runner.go:130] >             {
	I1219 06:05:53.354949 2059048 command_runner.go:130] >               "Network": {
	I1219 06:05:53.354953 2059048 command_runner.go:130] >                 "ipam": {},
	I1219 06:05:53.354958 2059048 command_runner.go:130] >                 "type": "loopback"
	I1219 06:05:53.354962 2059048 command_runner.go:130] >               },
	I1219 06:05:53.354967 2059048 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1219 06:05:53.354971 2059048 command_runner.go:130] >             }
	I1219 06:05:53.354975 2059048 command_runner.go:130] >           ],
	I1219 06:05:53.354988 2059048 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1219 06:05:53.354992 2059048 command_runner.go:130] >         },
	I1219 06:05:53.354997 2059048 command_runner.go:130] >         "IFName": "lo"
	I1219 06:05:53.355000 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355003 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355007 2059048 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1219 06:05:53.355011 2059048 command_runner.go:130] >     "PluginDirs": [
	I1219 06:05:53.355015 2059048 command_runner.go:130] >       "/opt/cni/bin"
	I1219 06:05:53.355027 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355031 2059048 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1219 06:05:53.355036 2059048 command_runner.go:130] >     "Prefix": "eth"
	I1219 06:05:53.355039 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355042 2059048 command_runner.go:130] >   "config": {
	I1219 06:05:53.355046 2059048 command_runner.go:130] >     "cdiSpecDirs": [
	I1219 06:05:53.355050 2059048 command_runner.go:130] >       "/etc/cdi",
	I1219 06:05:53.355059 2059048 command_runner.go:130] >       "/var/run/cdi"
	I1219 06:05:53.355062 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355066 2059048 command_runner.go:130] >     "cni": {
	I1219 06:05:53.355070 2059048 command_runner.go:130] >       "binDir": "",
	I1219 06:05:53.355073 2059048 command_runner.go:130] >       "binDirs": [
	I1219 06:05:53.355077 2059048 command_runner.go:130] >         "/opt/cni/bin"
	I1219 06:05:53.355080 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.355084 2059048 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1219 06:05:53.355088 2059048 command_runner.go:130] >       "confTemplate": "",
	I1219 06:05:53.355091 2059048 command_runner.go:130] >       "ipPref": "",
	I1219 06:05:53.355095 2059048 command_runner.go:130] >       "maxConfNum": 1,
	I1219 06:05:53.355099 2059048 command_runner.go:130] >       "setupSerially": false,
	I1219 06:05:53.355103 2059048 command_runner.go:130] >       "useInternalLoopback": false
	I1219 06:05:53.355106 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355114 2059048 command_runner.go:130] >     "containerd": {
	I1219 06:05:53.355119 2059048 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1219 06:05:53.355123 2059048 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1219 06:05:53.355128 2059048 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1219 06:05:53.355132 2059048 command_runner.go:130] >       "runtimes": {
	I1219 06:05:53.355136 2059048 command_runner.go:130] >         "runc": {
	I1219 06:05:53.355140 2059048 command_runner.go:130] >           "ContainerAnnotations": null,
	I1219 06:05:53.355145 2059048 command_runner.go:130] >           "PodAnnotations": null,
	I1219 06:05:53.355151 2059048 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1219 06:05:53.355155 2059048 command_runner.go:130] >           "cgroupWritable": false,
	I1219 06:05:53.355159 2059048 command_runner.go:130] >           "cniConfDir": "",
	I1219 06:05:53.355163 2059048 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1219 06:05:53.355167 2059048 command_runner.go:130] >           "io_type": "",
	I1219 06:05:53.355171 2059048 command_runner.go:130] >           "options": {
	I1219 06:05:53.355174 2059048 command_runner.go:130] >             "BinaryName": "",
	I1219 06:05:53.355179 2059048 command_runner.go:130] >             "CriuImagePath": "",
	I1219 06:05:53.355183 2059048 command_runner.go:130] >             "CriuWorkPath": "",
	I1219 06:05:53.355187 2059048 command_runner.go:130] >             "IoGid": 0,
	I1219 06:05:53.355190 2059048 command_runner.go:130] >             "IoUid": 0,
	I1219 06:05:53.355198 2059048 command_runner.go:130] >             "NoNewKeyring": false,
	I1219 06:05:53.355201 2059048 command_runner.go:130] >             "Root": "",
	I1219 06:05:53.355205 2059048 command_runner.go:130] >             "ShimCgroup": "",
	I1219 06:05:53.355210 2059048 command_runner.go:130] >             "SystemdCgroup": false
	I1219 06:05:53.355214 2059048 command_runner.go:130] >           },
	I1219 06:05:53.355219 2059048 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1219 06:05:53.355225 2059048 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1219 06:05:53.355229 2059048 command_runner.go:130] >           "runtimePath": "",
	I1219 06:05:53.355233 2059048 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1219 06:05:53.355238 2059048 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1219 06:05:53.355242 2059048 command_runner.go:130] >           "snapshotter": ""
	I1219 06:05:53.355245 2059048 command_runner.go:130] >         }
	I1219 06:05:53.355248 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355252 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355262 2059048 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1219 06:05:53.355267 2059048 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1219 06:05:53.355273 2059048 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1219 06:05:53.355277 2059048 command_runner.go:130] >     "disableApparmor": false,
	I1219 06:05:53.355282 2059048 command_runner.go:130] >     "disableHugetlbController": true,
	I1219 06:05:53.355286 2059048 command_runner.go:130] >     "disableProcMount": false,
	I1219 06:05:53.355290 2059048 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1219 06:05:53.355294 2059048 command_runner.go:130] >     "enableCDI": true,
	I1219 06:05:53.355298 2059048 command_runner.go:130] >     "enableSelinux": false,
	I1219 06:05:53.355302 2059048 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1219 06:05:53.355306 2059048 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1219 06:05:53.355311 2059048 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1219 06:05:53.355319 2059048 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1219 06:05:53.355323 2059048 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1219 06:05:53.355328 2059048 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1219 06:05:53.355332 2059048 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1219 06:05:53.355338 2059048 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1219 06:05:53.355342 2059048 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1219 06:05:53.355347 2059048 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1219 06:05:53.355357 2059048 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1219 06:05:53.355362 2059048 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1219 06:05:53.355365 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355369 2059048 command_runner.go:130] >   "features": {
	I1219 06:05:53.355373 2059048 command_runner.go:130] >     "supplemental_groups_policy": true
	I1219 06:05:53.355376 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355379 2059048 command_runner.go:130] >   "golang": "go1.24.9",
	I1219 06:05:53.355389 2059048 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1219 06:05:53.355399 2059048 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1219 06:05:53.355402 2059048 command_runner.go:130] >   "runtimeHandlers": [
	I1219 06:05:53.355406 2059048 command_runner.go:130] >     {
	I1219 06:05:53.355409 2059048 command_runner.go:130] >       "features": {
	I1219 06:05:53.355414 2059048 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1219 06:05:53.355418 2059048 command_runner.go:130] >         "user_namespaces": true
	I1219 06:05:53.355421 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355424 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355427 2059048 command_runner.go:130] >     {
	I1219 06:05:53.355431 2059048 command_runner.go:130] >       "features": {
	I1219 06:05:53.355436 2059048 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1219 06:05:53.355440 2059048 command_runner.go:130] >         "user_namespaces": true
	I1219 06:05:53.355443 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355447 2059048 command_runner.go:130] >       "name": "runc"
	I1219 06:05:53.355449 2059048 command_runner.go:130] >     }
	I1219 06:05:53.355452 2059048 command_runner.go:130] >   ],
	I1219 06:05:53.355456 2059048 command_runner.go:130] >   "status": {
	I1219 06:05:53.355460 2059048 command_runner.go:130] >     "conditions": [
	I1219 06:05:53.355463 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355467 2059048 command_runner.go:130] >         "message": "",
	I1219 06:05:53.355471 2059048 command_runner.go:130] >         "reason": "",
	I1219 06:05:53.355475 2059048 command_runner.go:130] >         "status": true,
	I1219 06:05:53.355480 2059048 command_runner.go:130] >         "type": "RuntimeReady"
	I1219 06:05:53.355483 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355486 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355495 2059048 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1219 06:05:53.355500 2059048 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1219 06:05:53.355504 2059048 command_runner.go:130] >         "status": false,
	I1219 06:05:53.355508 2059048 command_runner.go:130] >         "type": "NetworkReady"
	I1219 06:05:53.355512 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355515 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355536 2059048 command_runner.go:130] >         "message": "{\"io.containerd.deprecation/cgroup-v1\":\"The support for cgroup v1 is deprecated since containerd v2.2 and will be removed by no later than May 2029. Upgrade the host to use cgroup v2.\"}",
	I1219 06:05:53.355541 2059048 command_runner.go:130] >         "reason": "ContainerdHasDeprecationWarnings",
	I1219 06:05:53.355547 2059048 command_runner.go:130] >         "status": false,
	I1219 06:05:53.355552 2059048 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1219 06:05:53.355555 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355557 2059048 command_runner.go:130] >     ]
	I1219 06:05:53.355560 2059048 command_runner.go:130] >   }
	I1219 06:05:53.355563 2059048 command_runner.go:130] > }
	I1219 06:05:53.357747 2059048 cni.go:84] Creating CNI manager for ""
	I1219 06:05:53.357770 2059048 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:05:53.357795 2059048 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1219 06:05:53.357824 2059048 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-006924 NodeName:functional-006924 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1219 06:05:53.357938 2059048 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-006924"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1219 06:05:53.358021 2059048 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1219 06:05:53.365051 2059048 command_runner.go:130] > kubeadm
	I1219 06:05:53.365070 2059048 command_runner.go:130] > kubectl
	I1219 06:05:53.365074 2059048 command_runner.go:130] > kubelet
	I1219 06:05:53.366033 2059048 binaries.go:51] Found k8s binaries, skipping transfer
	I1219 06:05:53.366118 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1219 06:05:53.373810 2059048 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1219 06:05:53.386231 2059048 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1219 06:05:53.399156 2059048 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2235 bytes)
	I1219 06:05:53.411832 2059048 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1219 06:05:53.415476 2059048 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1219 06:05:53.415580 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:53.524736 2059048 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:05:53.900522 2059048 certs.go:69] Setting up /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924 for IP: 192.168.49.2
	I1219 06:05:53.900547 2059048 certs.go:195] generating shared ca certs ...
	I1219 06:05:53.900563 2059048 certs.go:227] acquiring lock for ca certs: {Name:mk382c71693ea4061363f97b153b21bf6cdf5f38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:53.900702 2059048 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key
	I1219 06:05:53.900780 2059048 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key
	I1219 06:05:53.900803 2059048 certs.go:257] generating profile certs ...
	I1219 06:05:53.900908 2059048 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key
	I1219 06:05:53.900976 2059048 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key.febe6fed
	I1219 06:05:53.901024 2059048 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key
	I1219 06:05:53.901037 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1219 06:05:53.901081 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1219 06:05:53.901098 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1219 06:05:53.901109 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1219 06:05:53.901127 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1219 06:05:53.901139 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1219 06:05:53.901154 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1219 06:05:53.901171 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1219 06:05:53.901229 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem (1338 bytes)
	W1219 06:05:53.901264 2059048 certs.go:480] ignoring /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386_empty.pem, impossibly tiny 0 bytes
	I1219 06:05:53.901277 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem (1679 bytes)
	I1219 06:05:53.901306 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem (1078 bytes)
	I1219 06:05:53.901333 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem (1123 bytes)
	I1219 06:05:53.901365 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem (1671 bytes)
	I1219 06:05:53.901418 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:05:53.901449 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:53.901465 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem -> /usr/share/ca-certificates/2000386.pem
	I1219 06:05:53.901481 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> /usr/share/ca-certificates/20003862.pem
	I1219 06:05:53.902039 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1219 06:05:53.926748 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1219 06:05:53.945718 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1219 06:05:53.964111 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1219 06:05:53.984388 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1219 06:05:54.005796 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1219 06:05:54.027058 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1219 06:05:54.045330 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1219 06:05:54.062681 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1219 06:05:54.080390 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem --> /usr/share/ca-certificates/2000386.pem (1338 bytes)
	I1219 06:05:54.102399 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /usr/share/ca-certificates/20003862.pem (1708 bytes)
	I1219 06:05:54.120580 2059048 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (722 bytes)
	I1219 06:05:54.133732 2059048 ssh_runner.go:195] Run: openssl version
	I1219 06:05:54.139799 2059048 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1219 06:05:54.140191 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.147812 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/20003862.pem /etc/ssl/certs/20003862.pem
	I1219 06:05:54.155315 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159037 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159108 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159165 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.200029 2059048 command_runner.go:130] > 3ec20f2e
	I1219 06:05:54.200546 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1219 06:05:54.208733 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.216254 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1219 06:05:54.224240 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228059 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228165 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228244 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.268794 2059048 command_runner.go:130] > b5213941
	I1219 06:05:54.269372 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1219 06:05:54.277054 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.284467 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/2000386.pem /etc/ssl/certs/2000386.pem
	I1219 06:05:54.291949 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295750 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295798 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295849 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.341163 2059048 command_runner.go:130] > 51391683
	I1219 06:05:54.341782 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1219 06:05:54.349497 2059048 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:05:54.353229 2059048 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:05:54.353253 2059048 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1219 06:05:54.353261 2059048 command_runner.go:130] > Device: 259,1	Inode: 1582667     Links: 1
	I1219 06:05:54.353268 2059048 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1219 06:05:54.353275 2059048 command_runner.go:130] > Access: 2025-12-19 06:01:47.245300782 +0000
	I1219 06:05:54.353281 2059048 command_runner.go:130] > Modify: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353286 2059048 command_runner.go:130] > Change: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353294 2059048 command_runner.go:130] >  Birth: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353372 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1219 06:05:54.398897 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.399374 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1219 06:05:54.440111 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.440565 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1219 06:05:54.481409 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.481968 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1219 06:05:54.522576 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.523020 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1219 06:05:54.563365 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.563892 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1219 06:05:54.604428 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.604920 2059048 kubeadm.go:401] StartCluster: {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:54.605002 2059048 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1219 06:05:54.605063 2059048 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:05:54.631433 2059048 cri.go:92] found id: ""
	I1219 06:05:54.631512 2059048 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1219 06:05:54.638289 2059048 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1219 06:05:54.638353 2059048 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1219 06:05:54.638374 2059048 command_runner.go:130] > /var/lib/minikube/etcd:
	I1219 06:05:54.639191 2059048 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1219 06:05:54.639207 2059048 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1219 06:05:54.639278 2059048 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1219 06:05:54.646289 2059048 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:05:54.646704 2059048 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-006924" does not appear in /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.646809 2059048 kubeconfig.go:62] /home/jenkins/minikube-integration/22230-1998525/kubeconfig needs updating (will repair): [kubeconfig missing "functional-006924" cluster setting kubeconfig missing "functional-006924" context setting]
	I1219 06:05:54.647118 2059048 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/kubeconfig: {Name:mk7db1732c7d76f01100426cb283dc7515a3b9ea Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.647542 2059048 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.647700 2059048 kapi.go:59] client config for functional-006924: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1219 06:05:54.648289 2059048 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1219 06:05:54.648312 2059048 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1219 06:05:54.648318 2059048 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1219 06:05:54.648377 2059048 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1219 06:05:54.648389 2059048 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1219 06:05:54.648357 2059048 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1219 06:05:54.648779 2059048 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1219 06:05:54.659696 2059048 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1219 06:05:54.659739 2059048 kubeadm.go:602] duration metric: took 20.517186ms to restartPrimaryControlPlane
	I1219 06:05:54.659750 2059048 kubeadm.go:403] duration metric: took 54.838405ms to StartCluster
	I1219 06:05:54.659766 2059048 settings.go:142] acquiring lock: {Name:mk0fb518a1861caea9ce90c087e9f98ff93c6842 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.659859 2059048 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.660602 2059048 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/kubeconfig: {Name:mk7db1732c7d76f01100426cb283dc7515a3b9ea Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.660878 2059048 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1219 06:05:54.661080 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:54.661197 2059048 addons.go:543] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1219 06:05:54.661465 2059048 addons.go:70] Setting storage-provisioner=true in profile "functional-006924"
	I1219 06:05:54.661481 2059048 addons.go:239] Setting addon storage-provisioner=true in "functional-006924"
	I1219 06:05:54.661506 2059048 host.go:66] Checking if "functional-006924" exists ...
	I1219 06:05:54.661954 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.662128 2059048 addons.go:70] Setting default-storageclass=true in profile "functional-006924"
	I1219 06:05:54.662158 2059048 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-006924"
	I1219 06:05:54.662427 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.667300 2059048 out.go:179] * Verifying Kubernetes components...
	I1219 06:05:54.673650 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:54.689683 2059048 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.689848 2059048 kapi.go:59] client config for functional-006924: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1219 06:05:54.690123 2059048 addons.go:239] Setting addon default-storageclass=true in "functional-006924"
	I1219 06:05:54.690152 2059048 host.go:66] Checking if "functional-006924" exists ...
	I1219 06:05:54.690560 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.715008 2059048 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1219 06:05:54.717850 2059048 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:54.717879 2059048 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1219 06:05:54.717946 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:54.734767 2059048 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:54.734788 2059048 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1219 06:05:54.734856 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:54.764236 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:54.773070 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:54.876977 2059048 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:05:54.898675 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:54.923995 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:55.652544 2059048 node_ready.go:35] waiting up to 6m0s for node "functional-006924" to be "Ready" ...
	I1219 06:05:55.652680 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:55.652777 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:55.653088 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.653133 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653174 2059048 retry.go:31] will retry after 152.748ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653242 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.653274 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653290 2059048 retry.go:31] will retry after 222.401366ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:55.806850 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:55.871164 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.871241 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.871268 2059048 retry.go:31] will retry after 248.166368ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.876351 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:55.932419 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.936105 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.936137 2059048 retry.go:31] will retry after 191.546131ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.120512 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:56.128049 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:56.153420 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:56.153544 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:56.153844 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:56.188805 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.192400 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.192475 2059048 retry.go:31] will retry after 421.141509ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.203130 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.203228 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.203252 2059048 retry.go:31] will retry after 495.708783ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.614800 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:56.653236 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:56.653361 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:56.653708 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:56.677894 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.677943 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.677993 2059048 retry.go:31] will retry after 980.857907ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.700099 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:56.755124 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.758623 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.758652 2059048 retry.go:31] will retry after 1.143622688s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.152911 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:57.153042 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:57.153399 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:57.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:57.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:57.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:05:57.653378 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:05:57.659518 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:57.724667 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:57.724716 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.724735 2059048 retry.go:31] will retry after 900.329628ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.903067 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:57.986230 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:57.986314 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.986340 2059048 retry.go:31] will retry after 1.7845791s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:58.153671 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:58.153749 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:58.154120 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:58.625732 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:58.653113 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:58.653187 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:58.653475 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:58.712944 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:58.713042 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:58.713071 2059048 retry.go:31] will retry after 2.322946675s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:59.153740 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:59.153822 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:59.154186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:59.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:59.652927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:59.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:59.771577 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:59.835749 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:59.839442 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:59.839476 2059048 retry.go:31] will retry after 2.412907222s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:00.152821 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:00.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:00.153306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:00.153393 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:00.653320 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:00.653404 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:00.653734 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:01.036322 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:01.102362 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:01.106179 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:01.106214 2059048 retry.go:31] will retry after 2.139899672s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:01.153490 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:01.153572 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:01.153855 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:01.653656 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:01.653732 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:01.654026 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:02.152793 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:02.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:02.153204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:02.252582 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:02.312437 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:02.312479 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:02.312500 2059048 retry.go:31] will retry after 1.566668648s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:02.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:02.652958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:02.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:02.653283 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:03.152957 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:03.153054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:03.153393 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:03.246844 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:03.302237 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:03.305728 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.305771 2059048 retry.go:31] will retry after 6.170177016s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.653408 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:03.653482 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:03.653834 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:03.880237 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:03.939688 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:03.939736 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.939756 2059048 retry.go:31] will retry after 4.919693289s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:04.153025 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:04.153101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:04.153368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:04.653333 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:04.653405 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:04.653716 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:04.653762 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:05.153589 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:05.153680 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:05.154012 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:05.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:05.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:05.653271 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:06.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:06.152922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:06.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:06.652913 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:06.652987 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:06.653350 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:07.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:07.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:07.153248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:07.153305 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:07.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:07.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:07.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:08.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:08.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.652856 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:08.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:08.653179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.859603 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:08.923746 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:08.923802 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:08.923824 2059048 retry.go:31] will retry after 7.49455239s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.153273 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:09.153361 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:09.153733 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:09.153794 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:09.476166 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:09.536340 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:09.536378 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.536397 2059048 retry.go:31] will retry after 3.264542795s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.652787 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:09.652863 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:09.653191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:10.152879 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:10.152955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:10.153217 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:10.653092 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:10.653172 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:10.653505 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:11.153189 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:11.153267 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:11.153564 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:11.653360 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:11.653432 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:11.653748 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:11.653809 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:12.153584 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:12.153667 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:12.154066 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:12.652816 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:12.652897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:12.653232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:12.801732 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:12.858668 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:12.858722 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:12.858742 2059048 retry.go:31] will retry after 7.015856992s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:13.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:13.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:13.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:13.652838 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:13.652915 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:13.653206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:14.152876 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:14.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:14.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:14.153340 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:14.653224 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:14.653299 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:14.653566 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:15.153381 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:15.153458 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:15.153856 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:15.653715 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:15.653796 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:15.654137 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:16.153469 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:16.153543 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:16.153826 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:16.153868 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:16.419404 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:16.476671 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:16.480081 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:16.480119 2059048 retry.go:31] will retry after 7.9937579s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:16.653575 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:16.653716 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:16.653985 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:17.153751 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:17.153850 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:17.154134 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:17.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:17.652939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:17.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:18.152893 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:18.152976 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:18.153301 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:18.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:18.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:18.653233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:18.653289 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:19.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:19.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:19.153227 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:19.652934 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:19.653010 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:19.653354 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:19.875781 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:19.950537 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:19.954067 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:19.954097 2059048 retry.go:31] will retry after 12.496952157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:20.153579 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:20.153656 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:20.154027 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:20.652751 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:20.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:20.653178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:21.153037 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:21.153112 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:21.153446 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:21.153504 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:21.652852 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:21.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:21.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:22.152882 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:22.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:22.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:22.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:22.652925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:22.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:23.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:23.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:23.153240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:23.652818 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:23.652892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:23.653158 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:23.653200 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:24.152783 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:24.152857 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:24.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:24.474774 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:24.538538 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:24.538585 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:24.538605 2059048 retry.go:31] will retry after 14.635173495s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:24.653139 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:24.653215 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:24.653538 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:25.153284 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:25.153354 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:25.153661 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:25.653607 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:25.653689 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:25.653986 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:25.654040 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:26.152728 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:26.152857 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:26.153186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:26.652777 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:26.652852 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:26.653175 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:27.152839 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:27.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:27.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:27.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:27.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:27.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:28.152874 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:28.152956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:28.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:28.153286 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:28.652849 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:28.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:28.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:29.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:29.152930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:29.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:29.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:29.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:29.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:30.152853 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:30.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:30.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:30.153348 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:30.653022 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:30.653115 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:30.653416 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:31.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:31.152960 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:31.153275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:31.652985 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:31.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:31.653405 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:32.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:32.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:32.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:32.451758 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:32.506473 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:32.509966 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:32.509998 2059048 retry.go:31] will retry after 31.028140902s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:32.653234 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:32.653309 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:32.653632 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:32.653749 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:33.153497 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:33.153583 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:33.153949 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:33.652732 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:33.652832 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:33.653182 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:34.152891 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:34.152964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:34.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:34.653098 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:34.653173 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:34.653525 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:35.153363 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:35.153489 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:35.153845 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:35.153907 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:35.653568 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:35.653649 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:35.653928 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:36.153725 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:36.153800 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:36.154115 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:36.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:36.652933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:36.653274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:37.153419 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:37.153492 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:37.153866 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:37.153952 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:37.653726 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:37.653797 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:37.654143 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:38.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:38.152933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:38.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:38.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:38.652935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:38.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:39.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:39.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:39.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:39.174643 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:39.239291 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:39.239335 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:39.239354 2059048 retry.go:31] will retry after 15.420333699s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:39.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:39.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:39.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:39.653316 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:40.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:40.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:40.153285 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:40.653056 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:40.653131 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:40.653494 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:41.153188 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:41.153263 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:41.153588 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:41.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:41.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:41.653248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:42.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:42.152964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:42.153379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:42.153461 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:42.652919 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:42.653000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:42.653314 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:43.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:43.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:43.153240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:43.652956 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:43.653027 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:43.653379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:44.152963 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:44.153044 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:44.153381 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:44.653201 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:44.653284 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:44.653550 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:44.653592 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:45.153794 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:45.153882 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:45.154325 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:45.653107 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:45.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:45.653497 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:46.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:46.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:46.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:46.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:46.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:46.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:47.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:47.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:47.153246 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:47.153293 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:47.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:47.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:47.653331 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:48.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:48.152904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:48.153254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:48.652976 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:48.653054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:48.653401 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:49.152928 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:49.153003 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:49.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:49.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:49.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:49.653266 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:49.653325 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:50.152835 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:50.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:50.153230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:50.653021 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:50.653097 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:50.653360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:51.152819 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:51.152892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:51.153216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:51.652938 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:51.653021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:51.653350 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:51.653404 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:52.152878 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:52.152997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:52.153340 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:52.653054 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:52.653126 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:52.653428 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:53.152809 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:53.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:53.153202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:53.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:53.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:53.653212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:54.152921 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:54.153000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:54.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:54.153361 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:54.653430 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:54.653504 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:54.653886 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:54.660097 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:54.724740 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:54.724806 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:54.724824 2059048 retry.go:31] will retry after 21.489743806s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:55.153047 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:55.153170 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:55.153542 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:55.653137 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:55.653210 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:55.653500 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:56.153216 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:56.153285 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:56.153620 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:56.153682 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:56.653420 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:56.653501 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:56.653832 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:57.153605 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:57.153702 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:57.154020 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:57.652746 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:57.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:57.653184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:58.152882 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:58.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:58.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:58.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:58.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:58.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:58.653262 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:59.152798 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:59.152874 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:59.153218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:59.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:59.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:59.653193 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:00.155125 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:00.155210 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:00.156183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:00.653343 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:00.653416 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:00.653737 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:00.653787 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:01.152970 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:01.153062 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:01.153389 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:01.652835 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:01.652908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:01.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:02.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:02.152952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:02.153330 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:02.652900 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:02.652986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:02.653368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:03.152826 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:03.152908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:03.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:03.153306 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:03.538820 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:07:03.598261 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:03.602187 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:03.602221 2059048 retry.go:31] will retry after 27.693032791s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:03.653410 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:03.653486 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:03.653840 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:04.153298 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:04.153371 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:04.153670 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:04.653539 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:04.653618 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:04.653956 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:05.153749 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:05.153837 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:05.154160 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:05.154219 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:05.653149 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:05.653217 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:05.653546 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:06.153378 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:06.153468 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:06.153799 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:06.653420 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:06.653494 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:06.653803 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:07.153113 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:07.153187 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:07.153451 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:07.652897 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:07.652979 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:07.653292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:07.653351 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:08.152825 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:08.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:08.153274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:08.652859 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:08.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:08.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:09.153667 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:09.153756 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:09.154076 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:09.652824 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:09.652899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:09.653232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:10.153341 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:10.153410 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:10.153757 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:10.153818 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:10.653710 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:10.653802 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:10.654164 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:11.152777 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:11.152862 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:11.153199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:11.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:11.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:11.653219 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:12.152831 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:12.152908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:12.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:12.652832 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:12.652911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:12.653226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:12.653273 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:13.152877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:13.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:13.153279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:13.652822 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:13.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:13.653241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:14.152814 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:14.152897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:14.153218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:14.653177 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:14.653250 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:14.653558 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:14.653611 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:15.153356 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:15.153436 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:15.153788 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:15.652725 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:15.652816 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:15.653161 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:16.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:16.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:16.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:16.215537 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:07:16.273841 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:16.273881 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:16.273899 2059048 retry.go:31] will retry after 30.872906877s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:16.653514 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:16.653598 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:16.653919 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:16.653970 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:17.153579 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:17.153656 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:17.153994 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:17.653597 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:17.653665 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:17.653945 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:18.152782 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:18.152859 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:18.153155 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:18.652836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:18.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:18.653269 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:19.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:19.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:19.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:19.153292 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:19.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:19.652916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:19.653250 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:20.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:20.152910 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:20.153258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:20.653266 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:20.653354 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:20.653711 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:21.153511 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:21.153585 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:21.153886 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:21.153948 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:21.653690 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:21.653776 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:21.654081 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:22.153312 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:22.153387 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:22.153749 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:22.653581 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:22.653661 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:22.654117 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:23.152715 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:23.152802 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:23.153141 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:23.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:23.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:23.653196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:23.653241 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:24.152848 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:24.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:24.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:24.653127 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:24.653203 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:24.653560 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:25.153321 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:25.153393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:25.153662 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:25.652744 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:25.652855 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:25.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:25.653298 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:26.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:26.153049 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:26.153397 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:26.652880 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:26.652963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:26.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:27.152811 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:27.152888 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:27.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:27.652936 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:27.653013 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:27.653346 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:27.653407 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:28.152886 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:28.152961 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:28.153298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:28.652829 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:28.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:28.653240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:29.152818 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:29.152890 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:29.153229 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:29.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:29.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:29.653200 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:30.152832 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:30.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:30.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:30.153321 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:30.652996 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:30.653069 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:30.653387 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:31.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:31.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:31.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:31.295743 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:07:31.365905 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:31.365953 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:31.366070 2059048 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1219 06:07:31.653338 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:31.653413 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:31.653757 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:32.153415 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:32.153519 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:32.153862 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:32.153934 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:32.653181 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:32.653249 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:32.653512 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:33.152815 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:33.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:33.153193 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:33.652817 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:33.652892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:33.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:34.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:34.152941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:34.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:34.653155 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:34.653231 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:34.653574 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:34.653631 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:35.153386 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:35.153461 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:35.153800 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:35.652767 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:35.652837 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:35.653104 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:36.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:36.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:36.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:36.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:36.652927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:36.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:37.152893 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:37.152978 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:37.153238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:37.153278 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:37.652927 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:37.653002 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:37.653295 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:38.152985 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:38.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:38.153404 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:38.652889 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:38.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:38.653220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:39.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:39.152920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:39.153233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:39.153294 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:39.652812 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:39.652889 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:39.653215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:40.152857 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:40.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:40.153187 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:40.653073 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:40.653148 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:40.653479 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:41.152804 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:41.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:41.153180 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:41.652771 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:41.652841 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:41.653154 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:41.653206 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:42.152884 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:42.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:42.153327 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:42.652853 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:42.652951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:42.653271 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:43.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:43.152931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:43.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:43.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:43.652918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:43.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:43.653314 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:44.152994 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:44.153073 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:44.153402 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:44.653413 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:44.653502 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:44.653799 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:45.153668 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:45.153801 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:45.154199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:45.653080 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:45.653158 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:45.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:45.653538 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:46.153253 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:46.153372 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:46.153620 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:46.653410 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:46.653505 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:46.653901 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:47.147624 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:07:47.153238 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:47.153313 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:47.153618 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:47.207245 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:47.207289 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:47.207381 2059048 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1219 06:07:47.212201 2059048 out.go:179] * Enabled addons: 
	I1219 06:07:47.215092 2059048 addons.go:546] duration metric: took 1m52.553895373s for enable addons: enabled=[]
	I1219 06:07:47.652840 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:47.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:47.653177 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:48.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:48.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:48.153274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:48.153336 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:48.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:48.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:48.653266 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:49.152877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:49.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:49.153222 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:49.652844 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:49.652940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:49.653312 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:50.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:50.152906 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:50.153448 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:50.153518 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:50.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:50.653101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:50.653362 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:51.153113 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:51.153194 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:51.153608 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:51.653414 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:51.653487 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:51.653829 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:52.153249 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:52.153337 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:52.153602 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:52.153645 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:52.653349 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:52.653422 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:52.653735 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:53.153542 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:53.153620 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:53.153960 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:53.653712 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:53.653793 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:53.654088 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:54.153153 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:54.153246 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:54.153650 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:54.153714 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:54.653597 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:54.653675 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:54.654059 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:55.153390 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:55.153470 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:55.153780 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:55.652892 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:55.652968 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:55.653343 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:56.153064 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:56.153144 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:56.153504 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:56.652934 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:56.653001 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:56.653305 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:56.653374 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:57.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:57.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:57.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:57.652979 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:57.653054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:57.653394 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:58.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:58.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:58.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:58.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:58.652916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:58.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:59.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:59.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:59.153252 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:59.153305 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:59.652871 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:59.652957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:59.653221 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:00.152926 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:00.153011 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:00.153341 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:00.653583 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:00.653664 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:00.654050 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:01.153430 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:01.153505 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:01.153842 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:01.153899 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:01.653658 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:01.653734 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:01.654077 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:02.152810 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:02.152894 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:02.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:02.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:02.652992 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:02.653255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:03.152814 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:03.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:03.153241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:03.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:03.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:03.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:03.653372 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:04.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:04.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:04.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:04.653311 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:04.653393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:04.653691 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:05.153373 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:05.153449 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:05.153786 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:05.653505 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:05.653577 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:05.653866 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:05.653911 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:06.153684 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:06.153763 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:06.154116 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:06.652849 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:06.652928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:06.653265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:07.152801 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:07.152875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:07.153140 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:07.652821 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:07.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:07.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:08.152989 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:08.153069 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:08.153365 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:08.153412 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:08.652920 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:08.652987 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:08.653255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:09.152797 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:09.152875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:09.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:09.652928 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:09.653026 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:09.653367 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:10.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:10.152928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:10.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:10.653055 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:10.653153 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:10.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:10.653536 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:11.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:11.152945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:11.153256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:11.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:11.652952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:11.653279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:12.152810 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:12.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:12.153241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:12.652960 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:12.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:12.653342 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:13.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:13.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:13.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:13.153250 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:13.652804 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:13.652895 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:13.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:14.152963 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:14.153048 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:14.153407 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:14.653382 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:14.653457 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:14.653810 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:15.153570 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:15.153650 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:15.153993 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:15.154055 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:15.652797 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:15.652875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:15.653205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:16.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:16.152927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:16.153181 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:16.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:16.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:16.653237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:17.152851 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:17.152930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:17.153255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:17.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:17.652952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:17.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:17.653327 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:18.152822 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:18.152897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:18.153211 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:18.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:18.652929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:18.653226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:19.152862 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:19.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:19.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:19.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:19.652943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:19.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:20.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:20.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:20.153353 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:20.153402 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:20.652914 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:20.652990 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:20.653265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:21.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:21.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:21.153199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:21.652903 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:21.652980 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:21.653286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:22.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:22.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:22.153212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:22.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:22.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:22.653257 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:22.653311 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:23.152987 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:23.153082 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:23.153450 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:23.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:23.652933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:23.653264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:24.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:24.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:24.153259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:24.653188 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:24.653259 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:24.653621 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:24.653676 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:25.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:25.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:25.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:25.653096 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:25.653176 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:25.653514 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:26.153303 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:26.153380 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:26.153718 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:26.653504 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:26.653583 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:26.653866 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:26.653917 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:27.153641 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:27.153723 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:27.154070 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:27.653768 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:27.653851 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:27.654198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:28.152880 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:28.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:28.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:28.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:28.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:28.653286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:29.152996 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:29.153076 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:29.153423 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:29.153485 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:29.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:29.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:29.653208 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:30.152848 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:30.152931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:30.153247 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:30.653099 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:30.653178 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:30.653543 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:31.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:31.152933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:31.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:31.652836 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:31.652916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:31.653254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:31.653310 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:32.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:32.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:32.153234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:32.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:32.652974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:32.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:33.152813 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:33.152892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:33.153182 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:33.652873 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:33.652952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:33.653279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:33.653339 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:34.152888 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:34.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:34.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:34.653221 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:34.653303 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:34.653662 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:35.153491 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:35.153567 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:35.153923 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:35.653686 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:35.653756 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:35.654034 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:35.654075 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:36.152742 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:36.152852 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:36.153178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:36.652917 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:36.652991 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:36.653328 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:37.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:37.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:37.153269 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:37.652832 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:37.652905 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:37.653225 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:38.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:38.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:38.153256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:38.153311 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:38.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:38.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:38.653254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:39.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:39.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:39.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:39.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:39.652948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:39.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:40.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:40.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:40.153201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:40.653085 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:40.653160 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:40.653488 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:40.653543 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:41.152787 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:41.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:41.153181 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:41.652770 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:41.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:41.653122 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:42.152887 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:42.152974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:42.153376 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:42.653103 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:42.653188 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:42.653511 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:42.653570 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:43.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:43.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:43.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:43.652856 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:43.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:43.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:44.153027 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:44.153105 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:44.153433 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:44.653459 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:44.653530 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:44.653788 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:44.653840 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:45.153678 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:45.153766 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:45.156105 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=2
	I1219 06:08:45.653114 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:45.653196 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:45.653533 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:46.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:46.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:46.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:46.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:46.652950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:46.653258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:47.152995 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:47.153106 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:47.153459 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:47.153515 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:47.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:47.652955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:47.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:48.152852 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:48.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:48.153282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:48.652874 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:48.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:48.653318 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:49.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:49.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:49.153202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:49.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:49.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:49.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:49.653282 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:50.152954 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:50.153027 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:50.153317 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:50.653024 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:50.653102 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:50.653365 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:51.152850 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:51.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:51.153219 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:51.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:51.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:51.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:51.653343 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:52.152878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:52.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:52.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:52.652974 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:52.653059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:52.653395 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:53.153096 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:53.153174 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:53.153508 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:53.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:53.652951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:53.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:54.152839 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:54.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:54.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:54.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:54.653199 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:54.653273 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:54.653604 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:55.153420 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:55.153510 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:55.153789 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:55.652811 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:55.652903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:55.653294 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:56.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:56.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:56.153223 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:56.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:56.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:56.653259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:56.653316 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:57.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:57.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:57.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:57.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:57.652981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:57.653323 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:58.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:58.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:58.153209 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:58.652840 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:58.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:58.653217 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:59.152905 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:59.152981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:59.153315 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:59.153381 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:59.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:59.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:59.653183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:00.152906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:00.153005 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:00.153357 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:00.653205 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:00.653291 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:00.653625 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:01.153283 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:01.153360 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:01.153628 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:01.153671 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:01.653416 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:01.653497 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:01.653884 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:02.153557 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:02.153633 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:02.154010 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:02.652736 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:02.652830 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:02.653106 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:03.152802 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:03.152877 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:03.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:03.652921 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:03.652999 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:03.653309 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:03.653358 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:04.152885 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:04.152955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:04.153320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:04.653344 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:04.653416 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:04.653746 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:05.153560 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:05.153640 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:05.153974 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:05.652768 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:05.652867 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:05.653179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:06.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:06.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:06.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:06.153272 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:06.652921 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:06.653000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:06.653306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:07.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:07.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:07.153227 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:07.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:07.652898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:07.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:08.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:08.152914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:08.153262 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:08.153318 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:08.652911 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:08.652979 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:08.653282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:09.152916 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:09.152986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:09.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:09.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:09.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:09.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:10.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:10.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:10.153226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:10.653112 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:10.653192 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:10.653511 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:10.653577 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:11.153350 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:11.153429 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:11.153777 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:11.653085 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:11.653162 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:11.653429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:12.152806 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:12.152904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:12.153213 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:12.652905 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:12.652980 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:12.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:13.152912 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:13.152981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:13.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:13.153367 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:13.653051 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:13.653127 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:13.653415 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:14.152821 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:14.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:14.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:14.653028 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:14.653106 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:14.653360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:15.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:15.152912 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:15.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:15.653006 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:15.653081 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:15.653415 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:15.653467 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:16.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:16.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:16.153234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:16.652830 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:16.652908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:16.653204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:17.152901 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:17.152974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:17.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:17.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:17.652923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:17.653180 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:18.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:18.152893 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:18.153221 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:18.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:18.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:18.652988 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:18.653283 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:19.152887 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:19.152961 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:19.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:19.652913 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:19.652992 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:19.653321 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:20.153018 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:20.153092 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:20.153437 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:20.153501 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:20.653011 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:20.653093 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:20.653372 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:21.153069 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:21.153153 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:21.153484 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:21.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:21.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:21.653322 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:22.153458 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:22.153530 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:22.153790 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:22.153833 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:22.653650 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:22.653724 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:22.654057 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:23.152779 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:23.152853 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:23.153175 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:23.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:23.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:23.653280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:24.152829 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:24.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:24.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:24.653129 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:24.653203 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:24.653539 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:24.653595 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:25.153181 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:25.153256 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:25.153525 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:25.653492 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:25.653572 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:25.653896 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:26.153709 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:26.153785 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:26.154149 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:26.653439 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:26.653511 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:26.653845 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:26.653912 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:27.153641 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:27.153711 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:27.154059 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:27.653737 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:27.653813 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:27.654171 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:28.152737 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:28.152824 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:28.153136 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:28.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:28.652929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:28.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:29.152816 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:29.152890 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:29.153304 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:29.153363 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:29.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:29.652949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:29.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:30.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:30.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:30.153286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:30.653142 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:30.653226 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:30.653576 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:31.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:31.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:31.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:31.652853 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:31.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:31.653285 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:31.653341 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:32.153002 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:32.153077 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:32.153389 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:32.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:32.652928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:32.653191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:33.152906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:33.152985 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:33.153320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:33.653030 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:33.653112 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:33.653463 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:33.653523 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:34.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:34.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:34.153191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:34.653266 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:34.653343 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:34.653688 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:35.153480 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:35.153562 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:35.153920 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:35.653700 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:35.653779 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:35.654078 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:35.654124 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:36.152813 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:36.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:36.153244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:36.652824 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:36.652902 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:36.653244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:37.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:37.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:37.153200 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:37.652845 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:37.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:37.653218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:38.152831 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:38.152912 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:38.153208 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:38.153253 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:38.652887 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:38.652966 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:38.653228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:39.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:39.152913 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:39.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:39.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:39.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:39.653299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:40.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:40.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:40.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:40.153287 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:40.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:40.653107 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:40.653447 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:41.152920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:41.153249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:41.652882 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:41.652956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:41.653222 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:42.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:42.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:42.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:42.153350 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:42.653039 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:42.653114 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:42.653443 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:43.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:43.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:43.153196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:43.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:43.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:43.653298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:44.153021 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:44.153098 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:44.153446 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:44.153502 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:44.653394 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:44.653463 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:44.653758 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:45.153716 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:45.153844 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:45.154316 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:45.653443 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:45.653522 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:45.653863 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:46.153623 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:46.153701 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:46.153971 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:46.154014 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:46.653765 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:46.653843 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:46.654187 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:47.152841 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:47.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:47.153244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:47.652861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:47.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:47.653190 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:48.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:48.152954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:48.153355 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:48.653077 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:48.653151 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:48.653475 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:48.653535 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:49.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:49.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:49.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:49.652829 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:49.652905 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:49.653255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:50.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:50.153052 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:50.153380 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:50.653140 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:50.653211 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:50.653679 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:50.653731 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:51.153473 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:51.153550 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:51.154738 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=1
	I1219 06:09:51.653546 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:51.653618 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:51.653958 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:52.153274 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:52.153349 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:52.153606 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:52.653351 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:52.653426 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:52.653752 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:52.653808 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:53.153430 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:53.153501 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:53.153810 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:53.653040 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:53.653137 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:53.653483 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:54.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:54.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:54.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:54.652950 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:54.653032 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:54.653335 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:55.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:55.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:55.153273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:55.153315 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:55.653562 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:55.653634 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:55.653988 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:56.152721 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:56.152824 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:56.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:56.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:56.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:56.653204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:57.152895 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:57.152971 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:57.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:57.153359 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:57.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:57.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:57.653235 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:58.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:58.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:58.153268 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:58.652975 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:58.653058 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:58.653396 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:59.153105 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:59.153184 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:59.153474 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:59.153520 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:59.652864 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:59.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:59.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:00.152931 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:00.153036 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:00.153356 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:00.653257 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:00.653334 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:00.653658 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:01.153453 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:01.153528 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:01.153794 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:01.153845 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:01.653619 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:01.653697 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:01.654088 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:02.153731 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:02.153810 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:02.154155 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:02.652784 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:02.652868 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:02.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:03.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:03.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:03.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:03.652980 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:03.653056 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:03.653404 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:03.653465 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:04.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:04.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:04.153292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:04.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:04.653267 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:04.653580 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:05.152837 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:05.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:05.153255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:05.652984 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:05.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:05.653348 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:06.153037 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:06.153117 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:06.153467 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:06.153522 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:06.653186 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:06.653261 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:06.653599 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:07.152860 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:07.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:07.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:07.652828 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:07.652907 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:07.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:08.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:08.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:08.153282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:08.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:08.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:08.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:08.653242 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:09.152822 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:09.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:09.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:09.652859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:09.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:09.653294 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:10.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:10.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:10.153225 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:10.653128 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:10.653226 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:10.653575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:10.653636 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:11.153427 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:11.153513 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:11.153854 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:11.653325 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:11.653393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:11.653695 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:12.153493 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:12.153567 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:12.153867 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:12.653672 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:12.653754 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:12.654079 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:12.654129 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:13.152786 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:13.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:13.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:13.652847 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:13.652923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:13.653234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:14.152794 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:14.152866 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:14.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:14.653147 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:14.653224 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:14.653478 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:15.152818 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:15.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:15.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:15.153301 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:15.653095 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:15.653176 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:15.653536 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:16.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:16.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:16.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:16.652894 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:16.652982 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:16.653351 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:17.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:17.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:17.153312 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:17.153367 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:17.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:17.652943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:17.653235 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:18.152825 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:18.152903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:18.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:18.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:18.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:18.653272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:19.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:19.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:19.153179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:19.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:19.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:19.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:19.653346 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:20.152854 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:20.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:20.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:20.653045 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:20.653125 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:20.653445 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:21.153150 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:21.153227 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:21.153575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:21.652847 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:21.652931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:21.653260 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:22.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:22.152940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:22.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:22.153260 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:22.652895 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:22.652974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:22.653308 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:23.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:23.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:23.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:23.653479 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:23.653551 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:23.653818 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:24.153710 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:24.153800 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:24.154142 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:24.154201 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:24.653230 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:24.653310 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:24.653643 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:25.153415 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:25.153494 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:25.153825 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:25.652780 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:25.652869 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:25.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:26.152955 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:26.153029 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:26.153332 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:26.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:26.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:26.653203 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:26.653244 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:27.152819 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:27.152893 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:27.153237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:27.652953 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:27.653040 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:27.653394 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:28.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:28.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:28.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:28.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:28.652921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:28.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:28.653296 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:29.153007 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:29.153109 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:29.153490 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:29.652904 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:29.653002 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:29.653393 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:30.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:30.152941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:30.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:30.653036 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:30.653110 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:30.653469 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:30.653528 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:31.152866 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:31.152940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:31.153201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:31.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:31.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:31.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:32.152981 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:32.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:32.153421 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:32.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:32.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:32.653184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:33.152815 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:33.152902 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:33.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:33.153256 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:33.652858 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:33.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:33.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:34.153699 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:34.153778 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:34.154156 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:34.652927 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:34.653004 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:34.653344 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:35.152944 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:35.153053 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:35.153407 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:35.153473 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:35.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:35.653108 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:35.653439 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:36.152985 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:36.153058 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:36.153410 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:36.652997 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:36.653074 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:36.653385 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:37.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:37.152927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:37.153178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:37.652820 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:37.652898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:37.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:37.653293 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:38.152835 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:38.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:38.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:38.652893 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:38.652984 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:38.653356 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:39.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:39.152909 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:39.153250 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:39.652830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:39.652914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:39.653251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:40.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:40.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:40.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:40.153267 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:40.653044 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:40.653127 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:40.653472 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:41.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:41.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:41.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:41.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:41.653202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:42.152888 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:42.152978 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:42.153360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:42.153425 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:42.653108 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:42.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:42.653535 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:43.153293 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:43.153377 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:43.153699 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:43.653511 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:43.653596 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:43.653946 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:44.153626 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:44.153701 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:44.154058 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:44.154116 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:44.653518 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:44.653586 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:44.653839 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:45.153714 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:45.153803 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:45.154242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:45.653103 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:45.653182 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:45.653567 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:46.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:46.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:46.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:46.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:46.652901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:46.653220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:46.653276 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:47.153000 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:47.153090 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:47.153484 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:47.652874 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:47.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:47.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:48.152829 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:48.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:48.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:48.653001 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:48.653085 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:48.653425 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:48.653480 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:49.152866 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:49.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:49.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:49.652909 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:49.652997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:49.653352 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:50.152932 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:50.153010 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:50.153347 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:50.653023 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:50.653100 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:50.653383 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:51.153061 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:51.153137 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:51.153452 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:51.153512 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:51.653210 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:51.653296 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:51.653657 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:52.153489 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:52.153557 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:52.153876 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:52.653692 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:52.653768 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:52.654090 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:53.152787 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:53.152864 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:53.153157 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:53.652784 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:53.652862 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:53.653125 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:53.653167 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:54.152851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:54.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:54.153283 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:54.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:54.653265 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:54.653642 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:55.153555 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:55.153638 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:55.153984 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:55.652996 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:55.653078 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:55.653399 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:55.653461 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:56.153144 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:56.153229 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:56.153575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:56.653124 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:56.653197 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:56.653482 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:57.152912 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:57.152988 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:57.153306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:57.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:57.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:57.653287 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:58.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:58.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:58.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:58.153273 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:58.652822 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:58.652903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:58.653236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:59.152792 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:59.152869 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:59.153185 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:59.652732 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:59.652827 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:59.653140 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:00.152931 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:00.153022 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:00.153339 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:00.153391 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:00.653248 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:00.653323 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:00.653669 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:01.153449 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:01.153521 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:01.153868 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:01.653237 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:01.653336 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:01.653684 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:02.153489 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:02.153575 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:02.153909 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:02.153978 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:02.653742 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:02.653822 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:02.654154 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:03.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:03.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:03.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:03.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:03.652939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:03.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:04.152862 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:04.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:04.153190 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:04.653274 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:04.653353 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:04.653687 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:04.653753 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:05.153511 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:05.153585 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:05.153947 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:05.652710 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:05.652796 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:05.653061 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:06.152795 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:06.152871 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:06.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:06.652955 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:06.653039 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:06.653379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:07.152874 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:07.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:07.153220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:07.153260 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:07.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:07.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:07.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:08.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:08.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:08.153280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:08.652915 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:08.652997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:08.653258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:09.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:09.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:09.153313 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:09.153369 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:09.653051 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:09.653130 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:09.653466 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:10.152899 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:10.152977 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:10.153296 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:10.653043 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:10.653165 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:10.653488 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:11.152868 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:11.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:11.153273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:11.653413 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:11.653483 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:11.653817 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:11.653864 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:12.153612 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:12.153686 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:12.154026 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:12.652742 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:12.652850 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:12.653128 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:13.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:13.152886 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:13.153131 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:13.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:13.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:13.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:14.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:14.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:14.153275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:14.153331 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:14.653226 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:14.653309 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:14.653648 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:15.153413 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:15.153488 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:15.153804 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:15.652744 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:15.652844 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:15.653142 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:16.152784 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:16.152853 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:16.153159 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:16.652816 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:16.652891 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:16.653186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:16.653233 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:17.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:17.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:17.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:17.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:17.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:17.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:18.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:18.152914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:18.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:18.652945 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:18.653025 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:18.653318 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:18.653380 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:19.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:19.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:19.153231 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:19.652844 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:19.652920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:19.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:20.152860 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:20.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:20.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:20.653014 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:20.653092 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:20.653351 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:21.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:21.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:21.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:21.153341 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:21.653010 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:21.653087 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:21.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:22.152850 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:22.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:22.153186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:22.652828 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:22.652906 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:22.653231 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:23.152837 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:23.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:23.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:23.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:23.652942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:23.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:23.653241 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:24.152900 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:24.152984 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:24.153313 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:24.653189 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:24.653270 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:24.653611 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:25.152919 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:25.152989 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:25.153298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:25.653028 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:25.653101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:25.653438 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:25.653492 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:26.153176 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:26.153256 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:26.153570 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:26.652912 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:26.652986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:26.653378 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:27.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:27.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:27.153259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:27.652839 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:27.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:27.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:28.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:28.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:28.153233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:28.153276 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:28.652833 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:28.652907 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:28.653249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:29.152989 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:29.153064 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:29.153396 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:29.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:29.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:29.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:30.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:30.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:30.153258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:30.153317 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:30.653045 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:30.653118 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:30.653429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:31.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:31.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:31.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:31.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:31.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:31.653292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:32.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:32.152919 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:32.153241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:32.652864 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:32.652940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:32.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:32.653263 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:33.152791 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:33.152875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:33.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:33.652896 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:33.652977 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:33.653320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:34.152876 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:34.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:34.153289 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:34.653352 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:34.653430 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:34.653807 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:34.653869 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:35.153637 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:35.153718 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:35.154044 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:35.652946 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:35.653021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:35.653340 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:36.152965 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:36.153049 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:36.153384 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:36.653133 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:36.653213 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:36.653535 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:37.153283 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:37.153355 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:37.153669 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:37.153731 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:37.653516 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:37.653597 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:37.653938 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:38.153755 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:38.153833 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:38.154248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:38.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:38.652956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:38.653272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:39.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:39.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:39.153296 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:39.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:39.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:39.653289 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:39.653347 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:40.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:40.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:40.153212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:40.653026 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:40.653107 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:40.653447 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:41.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:41.153249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:41.652949 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:41.653030 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:41.653297 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:42.152907 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:42.153021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:42.153435 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:42.153505 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:42.653182 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:42.653258 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:42.653594 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:43.152852 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:43.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:43.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:43.652858 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:43.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:43.653276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:44.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:44.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:44.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:44.653236 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:44.653312 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:44.653574 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:44.653614 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:45.153508 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:45.153630 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:45.154114 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:45.653037 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:45.653120 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:45.653478 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:46.152854 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:46.152928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:46.153280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:46.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:46.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:46.653286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:47.152995 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:47.153070 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:47.153369 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:47.153415 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:47.652884 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:47.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:47.653305 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:48.152817 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:48.152895 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:48.153237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:48.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:48.652925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:48.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:49.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:49.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:49.153213 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:49.652807 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:49.652884 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:49.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:49.653285 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:50.152972 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:50.153050 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:50.153344 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:50.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:50.653264 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:50.653522 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:51.153213 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:51.153287 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:51.153583 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:51.653360 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:51.653435 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:51.653779 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:51.653840 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:52.153070 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:52.153143 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:52.153403 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:52.653101 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:52.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:52.653483 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:53.152800 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:53.152878 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:53.153196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:53.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:53.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:53.653274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:54.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:54.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:54.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:54.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:54.653291 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:54.653363 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:54.653706 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:55.152998 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:55.153089 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:55.153429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:55.653064 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:55.653125 2059048 node_ready.go:38] duration metric: took 6m0.000540604s for node "functional-006924" to be "Ready" ...
	I1219 06:11:55.656290 2059048 out.go:203] 
	W1219 06:11:55.659114 2059048 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1219 06:11:55.659135 2059048 out.go:285] * 
	W1219 06:11:55.661307 2059048 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1219 06:11:55.664349 2059048 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106546939Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106615724Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106710289Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106798117Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106860600Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106921951Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106979256Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.107036249Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.107120804Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.107220809Z" level=info msg="Connect containerd service"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.107615078Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.108301575Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.121412760Z" level=info msg="Start subscribing containerd event"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.121497799Z" level=info msg="Start recovering state"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.122272830Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.122475040Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.164704607Z" level=info msg="Start event monitor"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.164941434Z" level=info msg="Start cni network conf syncer for default"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.165030970Z" level=info msg="Start streaming server"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.165132911Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.165363388Z" level=info msg="runtime interface starting up..."
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.165439754Z" level=info msg="starting plugins..."
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.165505280Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 19 06:05:53 functional-006924 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.167961084Z" level=info msg="containerd successfully booted in 0.088140s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:11:57.474906    8447 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:11:57.475456    8447 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:11:57.477223    8447 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:11:57.477790    8447 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:11:57.479381    8447 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 06:11:57 up 10:54,  0 user,  load average: 0.20, 0.29, 0.73
	Linux functional-006924 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 19 06:11:54 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:11:55 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 808.
	Dec 19 06:11:55 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:55 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:55 functional-006924 kubelet[8330]: E1219 06:11:55.199580    8330 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:11:55 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:11:55 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:11:55 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 809.
	Dec 19 06:11:55 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:55 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:55 functional-006924 kubelet[8336]: E1219 06:11:55.961831    8336 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:11:55 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:11:55 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:11:56 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 810.
	Dec 19 06:11:56 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:56 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:56 functional-006924 kubelet[8357]: E1219 06:11:56.721706    8357 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:11:56 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:11:56 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:11:57 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 811.
	Dec 19 06:11:57 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:57 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:57 functional-006924 kubelet[8441]: E1219 06:11:57.462990    8441 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:11:57 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:11:57 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924: exit status 2 (396.5677ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-006924" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/SoftStart (367.95s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods (2.67s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods
functional_test.go:711: (dbg) Run:  kubectl --context functional-006924 get po -A
functional_test.go:711: (dbg) Non-zero exit: kubectl --context functional-006924 get po -A: exit status 1 (59.531519ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:713: failed to get kubectl pods: args "kubectl --context functional-006924 get po -A" : exit status 1
functional_test.go:717: expected stderr to be empty but got *"The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?\n"*: args "kubectl --context functional-006924 get po -A"
functional_test.go:720: expected stdout to include *kube-system* but got *""*. args: "kubectl --context functional-006924 get po -A"
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-006924
helpers_test.go:244: (dbg) docker inspect functional-006924:

-- stdout --
	[
	    {
	        "Id": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	        "Created": "2025-12-19T05:57:32.987616309Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2053574,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T05:57:33.050252475Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hostname",
	        "HostsPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hosts",
	        "LogPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6-json.log",
	        "Name": "/functional-006924",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-006924:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-006924",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	                "LowerDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/merged",
	                "UpperDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/diff",
	                "WorkDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-006924",
	                "Source": "/var/lib/docker/volumes/functional-006924/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-006924",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-006924",
	                "name.minikube.sigs.k8s.io": "functional-006924",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "c06ab2bd44169716d410789ed39ed6e7c04e20cbf7fddb96691439282b9c97ca",
	            "SandboxKey": "/var/run/docker/netns/c06ab2bd4416",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34704"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34705"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34708"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34706"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34707"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-006924": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "42:2f:87:6a:a8:7b",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "f63e8dc2cff83663f8a4d14108f192e61e457410fa4fc720cd9630dbf354815d",
	                    "EndpointID": "aa2b1cbd90d5c1f6130481423d97f82d974d4197e41ad0dbe3b7e51b22c8b4cc",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-006924",
	                        "651d0d6ef1db"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924: exit status 2 (312.631731ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-006924 logs -n 25: (1.007560568s)
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                         ARGS                                                                          │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-125117 image rm kicbase/echo-server:functional-125117 --alsologtostderr                                                                    │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                             │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image save --daemon kicbase/echo-server:functional-125117 --alsologtostderr                                                         │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/2000386.pem                                                                                             │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /usr/share/ca-certificates/2000386.pem                                                                                 │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                              │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/20003862.pem                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /usr/share/ca-certificates/20003862.pem                                                                                │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                              │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh sudo cat /etc/test/nested/copy/2000386/hosts                                                                                    │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format short --alsologtostderr                                                                                           │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format yaml --alsologtostderr                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ ssh            │ functional-125117 ssh pgrep buildkitd                                                                                                                 │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │                     │
	│ image          │ functional-125117 image build -t localhost/my-image:functional-125117 testdata/build --alsologtostderr                                                │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format json --alsologtostderr                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format table --alsologtostderr                                                                                           │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ delete         │ -p functional-125117                                                                                                                                  │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ start          │ -p functional-006924 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │                     │
	│ start          │ -p functional-006924 --alsologtostderr -v=8                                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:05 UTC │                     │
	└────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 06:05:50
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 06:05:50.537990 2059048 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:05:50.538849 2059048 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:05:50.538894 2059048 out.go:374] Setting ErrFile to fd 2...
	I1219 06:05:50.538913 2059048 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:05:50.539188 2059048 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:05:50.539610 2059048 out.go:368] Setting JSON to false
	I1219 06:05:50.540502 2059048 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":38897,"bootTime":1766085454,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:05:50.540601 2059048 start.go:143] virtualization:  
	I1219 06:05:50.544140 2059048 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:05:50.547152 2059048 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:05:50.547218 2059048 notify.go:221] Checking for updates...
	I1219 06:05:50.550931 2059048 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:05:50.553869 2059048 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:50.556730 2059048 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:05:50.559634 2059048 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:05:50.562450 2059048 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:05:50.565702 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:50.565828 2059048 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:05:50.590709 2059048 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:05:50.590846 2059048 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:05:50.653898 2059048 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:05:50.644590744 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:05:50.654020 2059048 docker.go:319] overlay module found
	I1219 06:05:50.657204 2059048 out.go:179] * Using the docker driver based on existing profile
	I1219 06:05:50.660197 2059048 start.go:309] selected driver: docker
	I1219 06:05:50.660214 2059048 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:50.660310 2059048 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:05:50.660408 2059048 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:05:50.713439 2059048 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:05:50.704333478 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:05:50.713872 2059048 cni.go:84] Creating CNI manager for ""
	I1219 06:05:50.713935 2059048 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:05:50.713992 2059048 start.go:353] cluster config:
	{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:50.717210 2059048 out.go:179] * Starting "functional-006924" primary control-plane node in "functional-006924" cluster
	I1219 06:05:50.719980 2059048 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1219 06:05:50.722935 2059048 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1219 06:05:50.726070 2059048 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:05:50.726124 2059048 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1219 06:05:50.726135 2059048 cache.go:65] Caching tarball of preloaded images
	I1219 06:05:50.726179 2059048 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1219 06:05:50.726225 2059048 preload.go:238] Found /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1219 06:05:50.726236 2059048 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1219 06:05:50.726339 2059048 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/config.json ...
	I1219 06:05:50.745888 2059048 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1219 06:05:50.745915 2059048 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1219 06:05:50.745932 2059048 cache.go:243] Successfully downloaded all kic artifacts
	I1219 06:05:50.745963 2059048 start.go:360] acquireMachinesLock for functional-006924: {Name:mkc84f48e83d18024791d45db780f3ccd746613a Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1219 06:05:50.746023 2059048 start.go:364] duration metric: took 37.752µs to acquireMachinesLock for "functional-006924"
	I1219 06:05:50.746049 2059048 start.go:96] Skipping create...Using existing machine configuration
	I1219 06:05:50.746059 2059048 fix.go:54] fixHost starting: 
	I1219 06:05:50.746334 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:50.762745 2059048 fix.go:112] recreateIfNeeded on functional-006924: state=Running err=<nil>
	W1219 06:05:50.762777 2059048 fix.go:138] unexpected machine state, will restart: <nil>
	I1219 06:05:50.765990 2059048 out.go:252] * Updating the running docker "functional-006924" container ...
	I1219 06:05:50.766020 2059048 machine.go:94] provisionDockerMachine start ...
	I1219 06:05:50.766101 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:50.782668 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:50.783000 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:50.783017 2059048 main.go:144] libmachine: About to run SSH command:
	hostname
	I1219 06:05:50.940618 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:05:50.940641 2059048 ubuntu.go:182] provisioning hostname "functional-006924"
	I1219 06:05:50.940708 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:50.964854 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:50.965181 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:50.965199 2059048 main.go:144] libmachine: About to run SSH command:
	sudo hostname functional-006924 && echo "functional-006924" | sudo tee /etc/hostname
	I1219 06:05:51.129720 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:05:51.129816 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.147357 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:51.147663 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:51.147686 2059048 main.go:144] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-006924' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-006924/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-006924' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1219 06:05:51.301923 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: 
	I1219 06:05:51.301949 2059048 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22230-1998525/.minikube CaCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22230-1998525/.minikube}
	I1219 06:05:51.301977 2059048 ubuntu.go:190] setting up certificates
	I1219 06:05:51.301985 2059048 provision.go:84] configureAuth start
	I1219 06:05:51.302047 2059048 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:05:51.323653 2059048 provision.go:143] copyHostCerts
	I1219 06:05:51.323700 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:05:51.323742 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem, removing ...
	I1219 06:05:51.323756 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:05:51.323832 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem (1078 bytes)
	I1219 06:05:51.323915 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:05:51.323932 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem, removing ...
	I1219 06:05:51.323937 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:05:51.323964 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem (1123 bytes)
	I1219 06:05:51.324003 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:05:51.324018 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem, removing ...
	I1219 06:05:51.324022 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:05:51.324044 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem (1671 bytes)
	I1219 06:05:51.324090 2059048 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem org=jenkins.functional-006924 san=[127.0.0.1 192.168.49.2 functional-006924 localhost minikube]
	I1219 06:05:51.441821 2059048 provision.go:177] copyRemoteCerts
	I1219 06:05:51.441886 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1219 06:05:51.441926 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.459787 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.570296 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1219 06:05:51.570372 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1219 06:05:51.588363 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1219 06:05:51.588477 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1219 06:05:51.605684 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1219 06:05:51.605798 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1219 06:05:51.623473 2059048 provision.go:87] duration metric: took 321.473451ms to configureAuth
	I1219 06:05:51.623556 2059048 ubuntu.go:206] setting minikube options for container-runtime
	I1219 06:05:51.623741 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:51.623756 2059048 machine.go:97] duration metric: took 857.728961ms to provisionDockerMachine
	I1219 06:05:51.623765 2059048 start.go:293] postStartSetup for "functional-006924" (driver="docker")
	I1219 06:05:51.623788 2059048 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1219 06:05:51.623849 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1219 06:05:51.623892 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.641371 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.760842 2059048 ssh_runner.go:195] Run: cat /etc/os-release
	I1219 06:05:51.764225 2059048 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1219 06:05:51.764245 2059048 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1219 06:05:51.764250 2059048 command_runner.go:130] > VERSION_ID="12"
	I1219 06:05:51.764255 2059048 command_runner.go:130] > VERSION="12 (bookworm)"
	I1219 06:05:51.764259 2059048 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1219 06:05:51.764263 2059048 command_runner.go:130] > ID=debian
	I1219 06:05:51.764268 2059048 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1219 06:05:51.764273 2059048 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1219 06:05:51.764281 2059048 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1219 06:05:51.764323 2059048 main.go:144] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1219 06:05:51.764339 2059048 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1219 06:05:51.764350 2059048 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/addons for local assets ...
	I1219 06:05:51.764404 2059048 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/files for local assets ...
	I1219 06:05:51.764485 2059048 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> 20003862.pem in /etc/ssl/certs
	I1219 06:05:51.764491 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> /etc/ssl/certs/20003862.pem
	I1219 06:05:51.764572 2059048 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> hosts in /etc/test/nested/copy/2000386
	I1219 06:05:51.764576 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> /etc/test/nested/copy/2000386/hosts
	I1219 06:05:51.764619 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/2000386
	I1219 06:05:51.772196 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:05:51.790438 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts --> /etc/test/nested/copy/2000386/hosts (40 bytes)
	I1219 06:05:51.808099 2059048 start.go:296] duration metric: took 184.303334ms for postStartSetup
	I1219 06:05:51.808203 2059048 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:05:51.808277 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.825566 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.929610 2059048 command_runner.go:130] > 14%
	I1219 06:05:51.930200 2059048 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1219 06:05:51.934641 2059048 command_runner.go:130] > 169G
	I1219 06:05:51.935117 2059048 fix.go:56] duration metric: took 1.189053781s for fixHost
	I1219 06:05:51.935139 2059048 start.go:83] releasing machines lock for "functional-006924", held for 1.189101272s
	I1219 06:05:51.935225 2059048 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:05:51.954055 2059048 ssh_runner.go:195] Run: cat /version.json
	I1219 06:05:51.954105 2059048 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1219 06:05:51.954110 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.954164 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.979421 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.998216 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:52.088735 2059048 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1219 06:05:52.088901 2059048 ssh_runner.go:195] Run: systemctl --version
	I1219 06:05:52.184102 2059048 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1219 06:05:52.186843 2059048 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1219 06:05:52.186921 2059048 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1219 06:05:52.187021 2059048 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1219 06:05:52.191424 2059048 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1219 06:05:52.191590 2059048 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1219 06:05:52.191669 2059048 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1219 06:05:52.199647 2059048 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1219 06:05:52.199671 2059048 start.go:496] detecting cgroup driver to use...
	I1219 06:05:52.199702 2059048 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1219 06:05:52.199771 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1219 06:05:52.215530 2059048 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1219 06:05:52.228927 2059048 docker.go:218] disabling cri-docker service (if available) ...
	I1219 06:05:52.229039 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1219 06:05:52.245166 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1219 06:05:52.258582 2059048 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1219 06:05:52.378045 2059048 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1219 06:05:52.513092 2059048 docker.go:234] disabling docker service ...
	I1219 06:05:52.513180 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1219 06:05:52.528704 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1219 06:05:52.542109 2059048 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1219 06:05:52.652456 2059048 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1219 06:05:52.767269 2059048 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1219 06:05:52.781039 2059048 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1219 06:05:52.797281 2059048 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1219 06:05:52.797396 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1219 06:05:52.807020 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1219 06:05:52.816571 2059048 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1219 06:05:52.816661 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1219 06:05:52.826225 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:05:52.835109 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1219 06:05:52.843741 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:05:52.852504 2059048 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1219 06:05:52.860160 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1219 06:05:52.868883 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1219 06:05:52.877906 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1219 06:05:52.887403 2059048 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1219 06:05:52.894024 2059048 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1219 06:05:52.894921 2059048 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1219 06:05:52.902164 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:53.021703 2059048 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1219 06:05:53.168216 2059048 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1219 06:05:53.168331 2059048 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1219 06:05:53.171951 2059048 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1219 06:05:53.172022 2059048 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1219 06:05:53.172043 2059048 command_runner.go:130] > Device: 0,72	Inode: 1614        Links: 1
	I1219 06:05:53.172065 2059048 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1219 06:05:53.172084 2059048 command_runner.go:130] > Access: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172112 2059048 command_runner.go:130] > Modify: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172131 2059048 command_runner.go:130] > Change: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172148 2059048 command_runner.go:130] >  Birth: -
	I1219 06:05:53.172331 2059048 start.go:564] Will wait 60s for crictl version
	I1219 06:05:53.172432 2059048 ssh_runner.go:195] Run: which crictl
	I1219 06:05:53.175887 2059048 command_runner.go:130] > /usr/local/bin/crictl
	I1219 06:05:53.176199 2059048 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1219 06:05:53.203136 2059048 command_runner.go:130] > Version:  0.1.0
	I1219 06:05:53.203389 2059048 command_runner.go:130] > RuntimeName:  containerd
	I1219 06:05:53.203588 2059048 command_runner.go:130] > RuntimeVersion:  v2.2.0
	I1219 06:05:53.203784 2059048 command_runner.go:130] > RuntimeApiVersion:  v1
	I1219 06:05:53.207710 2059048 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1219 06:05:53.207845 2059048 ssh_runner.go:195] Run: containerd --version
	I1219 06:05:53.235328 2059048 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1219 06:05:53.237219 2059048 ssh_runner.go:195] Run: containerd --version
	I1219 06:05:53.254490 2059048 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1219 06:05:53.262101 2059048 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1219 06:05:53.264978 2059048 cli_runner.go:164] Run: docker network inspect functional-006924 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 06:05:53.280549 2059048 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1219 06:05:53.284647 2059048 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1219 06:05:53.284847 2059048 kubeadm.go:884] updating cluster {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1219 06:05:53.284979 2059048 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:05:53.285048 2059048 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:05:53.307306 2059048 command_runner.go:130] > {
	I1219 06:05:53.307331 2059048 command_runner.go:130] >   "images":  [
	I1219 06:05:53.307335 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307345 2059048 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1219 06:05:53.307350 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307356 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1219 06:05:53.307360 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307365 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307373 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1219 06:05:53.307380 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307385 2059048 command_runner.go:130] >       "size":  "40636774",
	I1219 06:05:53.307391 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307395 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307402 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307405 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307417 2059048 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1219 06:05:53.307425 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307431 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1219 06:05:53.307435 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307441 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307450 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1219 06:05:53.307455 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307460 2059048 command_runner.go:130] >       "size":  "8034419",
	I1219 06:05:53.307463 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307467 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307470 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307482 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307492 2059048 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1219 06:05:53.307496 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307501 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1219 06:05:53.307505 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307519 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307528 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1219 06:05:53.307534 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307538 2059048 command_runner.go:130] >       "size":  "21168808",
	I1219 06:05:53.307542 2059048 command_runner.go:130] >       "username":  "nonroot",
	I1219 06:05:53.307546 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307549 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307552 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307559 2059048 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1219 06:05:53.307565 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307570 2059048 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1219 06:05:53.307581 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307585 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307592 2059048 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1219 06:05:53.307598 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307602 2059048 command_runner.go:130] >       "size":  "21749640",
	I1219 06:05:53.307610 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307614 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307618 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307622 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307625 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307631 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307634 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307641 2059048 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1219 06:05:53.307647 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307653 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1219 06:05:53.307666 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307670 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307682 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1219 06:05:53.307689 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307693 2059048 command_runner.go:130] >       "size":  "24692223",
	I1219 06:05:53.307697 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307708 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307712 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307716 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307723 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307726 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307729 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307736 2059048 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1219 06:05:53.307742 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307748 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1219 06:05:53.307753 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307757 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307765 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1219 06:05:53.307769 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307773 2059048 command_runner.go:130] >       "size":  "20672157",
	I1219 06:05:53.307779 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307783 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307788 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307792 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307796 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307799 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307802 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307809 2059048 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1219 06:05:53.307813 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307821 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1219 06:05:53.307826 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307830 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307840 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1219 06:05:53.307845 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307849 2059048 command_runner.go:130] >       "size":  "22432301",
	I1219 06:05:53.307858 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307864 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307867 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307870 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307877 2059048 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1219 06:05:53.307884 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307889 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1219 06:05:53.307893 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307899 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307907 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1219 06:05:53.307913 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307917 2059048 command_runner.go:130] >       "size":  "15405535",
	I1219 06:05:53.307921 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307925 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307928 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307932 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307939 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307942 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307948 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307955 2059048 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1219 06:05:53.307963 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307967 2059048 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1219 06:05:53.307970 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307974 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307982 2059048 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1219 06:05:53.307987 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307991 2059048 command_runner.go:130] >       "size":  "267939",
	I1219 06:05:53.307996 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.308000 2059048 command_runner.go:130] >         "value":  "65535"
	I1219 06:05:53.308004 2059048 command_runner.go:130] >       },
	I1219 06:05:53.308011 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.308015 2059048 command_runner.go:130] >       "pinned":  true
	I1219 06:05:53.308020 2059048 command_runner.go:130] >     }
	I1219 06:05:53.308027 2059048 command_runner.go:130] >   ]
	I1219 06:05:53.308030 2059048 command_runner.go:130] > }
	I1219 06:05:53.310449 2059048 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:05:53.310472 2059048 containerd.go:534] Images already preloaded, skipping extraction
	I1219 06:05:53.310540 2059048 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:05:53.331271 2059048 command_runner.go:130] > {
	I1219 06:05:53.331288 2059048 command_runner.go:130] >   "images":  [
	I1219 06:05:53.331292 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331304 2059048 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1219 06:05:53.331309 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331314 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1219 06:05:53.331318 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331322 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331332 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1219 06:05:53.331336 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331340 2059048 command_runner.go:130] >       "size":  "40636774",
	I1219 06:05:53.331350 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331355 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331358 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331361 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331369 2059048 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1219 06:05:53.331373 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331378 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1219 06:05:53.331381 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331385 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331393 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1219 06:05:53.331396 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331400 2059048 command_runner.go:130] >       "size":  "8034419",
	I1219 06:05:53.331404 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331408 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331411 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331414 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331421 2059048 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1219 06:05:53.331425 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331430 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1219 06:05:53.331433 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331439 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331447 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1219 06:05:53.331451 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331454 2059048 command_runner.go:130] >       "size":  "21168808",
	I1219 06:05:53.331458 2059048 command_runner.go:130] >       "username":  "nonroot",
	I1219 06:05:53.331462 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331466 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331468 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331475 2059048 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1219 06:05:53.331479 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331484 2059048 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1219 06:05:53.331487 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331491 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331502 2059048 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1219 06:05:53.331506 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331510 2059048 command_runner.go:130] >       "size":  "21749640",
	I1219 06:05:53.331515 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331519 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331522 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331526 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331530 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331533 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331536 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331543 2059048 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1219 06:05:53.331547 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331551 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1219 06:05:53.331555 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331559 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331566 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1219 06:05:53.331569 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331573 2059048 command_runner.go:130] >       "size":  "24692223",
	I1219 06:05:53.331577 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331585 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331592 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331596 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331600 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331603 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331606 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331613 2059048 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1219 06:05:53.331617 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331622 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1219 06:05:53.331626 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331629 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331638 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1219 06:05:53.331641 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331645 2059048 command_runner.go:130] >       "size":  "20672157",
	I1219 06:05:53.331652 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331656 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331659 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331663 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331666 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331669 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331672 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331679 2059048 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1219 06:05:53.331683 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331688 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1219 06:05:53.331691 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331695 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331702 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1219 06:05:53.331705 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331709 2059048 command_runner.go:130] >       "size":  "22432301",
	I1219 06:05:53.331713 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331717 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331720 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331723 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331733 2059048 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1219 06:05:53.331737 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331742 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1219 06:05:53.331745 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331749 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331757 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1219 06:05:53.331760 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331764 2059048 command_runner.go:130] >       "size":  "15405535",
	I1219 06:05:53.331767 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331771 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331774 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331778 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331782 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331785 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331792 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331799 2059048 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1219 06:05:53.331803 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331807 2059048 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1219 06:05:53.331811 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331815 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331822 2059048 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1219 06:05:53.331826 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331829 2059048 command_runner.go:130] >       "size":  "267939",
	I1219 06:05:53.331833 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331837 2059048 command_runner.go:130] >         "value":  "65535"
	I1219 06:05:53.331841 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331845 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331849 2059048 command_runner.go:130] >       "pinned":  true
	I1219 06:05:53.331852 2059048 command_runner.go:130] >     }
	I1219 06:05:53.331855 2059048 command_runner.go:130] >   ]
	I1219 06:05:53.331858 2059048 command_runner.go:130] > }
	I1219 06:05:53.333541 2059048 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:05:53.333565 2059048 cache_images.go:86] Images are preloaded, skipping loading
	I1219 06:05:53.333574 2059048 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1219 06:05:53.333694 2059048 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-006924 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1219 06:05:53.333773 2059048 ssh_runner.go:195] Run: sudo crictl --timeout=10s info
	I1219 06:05:53.354890 2059048 command_runner.go:130] > {
	I1219 06:05:53.354909 2059048 command_runner.go:130] >   "cniconfig": {
	I1219 06:05:53.354915 2059048 command_runner.go:130] >     "Networks": [
	I1219 06:05:53.354919 2059048 command_runner.go:130] >       {
	I1219 06:05:53.354926 2059048 command_runner.go:130] >         "Config": {
	I1219 06:05:53.354932 2059048 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1219 06:05:53.354937 2059048 command_runner.go:130] >           "Name": "cni-loopback",
	I1219 06:05:53.354941 2059048 command_runner.go:130] >           "Plugins": [
	I1219 06:05:53.354945 2059048 command_runner.go:130] >             {
	I1219 06:05:53.354949 2059048 command_runner.go:130] >               "Network": {
	I1219 06:05:53.354953 2059048 command_runner.go:130] >                 "ipam": {},
	I1219 06:05:53.354958 2059048 command_runner.go:130] >                 "type": "loopback"
	I1219 06:05:53.354962 2059048 command_runner.go:130] >               },
	I1219 06:05:53.354967 2059048 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1219 06:05:53.354971 2059048 command_runner.go:130] >             }
	I1219 06:05:53.354975 2059048 command_runner.go:130] >           ],
	I1219 06:05:53.354988 2059048 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1219 06:05:53.354992 2059048 command_runner.go:130] >         },
	I1219 06:05:53.354997 2059048 command_runner.go:130] >         "IFName": "lo"
	I1219 06:05:53.355000 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355003 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355007 2059048 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1219 06:05:53.355011 2059048 command_runner.go:130] >     "PluginDirs": [
	I1219 06:05:53.355015 2059048 command_runner.go:130] >       "/opt/cni/bin"
	I1219 06:05:53.355027 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355031 2059048 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1219 06:05:53.355036 2059048 command_runner.go:130] >     "Prefix": "eth"
	I1219 06:05:53.355039 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355042 2059048 command_runner.go:130] >   "config": {
	I1219 06:05:53.355046 2059048 command_runner.go:130] >     "cdiSpecDirs": [
	I1219 06:05:53.355050 2059048 command_runner.go:130] >       "/etc/cdi",
	I1219 06:05:53.355059 2059048 command_runner.go:130] >       "/var/run/cdi"
	I1219 06:05:53.355062 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355066 2059048 command_runner.go:130] >     "cni": {
	I1219 06:05:53.355070 2059048 command_runner.go:130] >       "binDir": "",
	I1219 06:05:53.355073 2059048 command_runner.go:130] >       "binDirs": [
	I1219 06:05:53.355077 2059048 command_runner.go:130] >         "/opt/cni/bin"
	I1219 06:05:53.355080 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.355084 2059048 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1219 06:05:53.355088 2059048 command_runner.go:130] >       "confTemplate": "",
	I1219 06:05:53.355091 2059048 command_runner.go:130] >       "ipPref": "",
	I1219 06:05:53.355095 2059048 command_runner.go:130] >       "maxConfNum": 1,
	I1219 06:05:53.355099 2059048 command_runner.go:130] >       "setupSerially": false,
	I1219 06:05:53.355103 2059048 command_runner.go:130] >       "useInternalLoopback": false
	I1219 06:05:53.355106 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355114 2059048 command_runner.go:130] >     "containerd": {
	I1219 06:05:53.355119 2059048 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1219 06:05:53.355123 2059048 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1219 06:05:53.355128 2059048 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1219 06:05:53.355132 2059048 command_runner.go:130] >       "runtimes": {
	I1219 06:05:53.355136 2059048 command_runner.go:130] >         "runc": {
	I1219 06:05:53.355140 2059048 command_runner.go:130] >           "ContainerAnnotations": null,
	I1219 06:05:53.355145 2059048 command_runner.go:130] >           "PodAnnotations": null,
	I1219 06:05:53.355151 2059048 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1219 06:05:53.355155 2059048 command_runner.go:130] >           "cgroupWritable": false,
	I1219 06:05:53.355159 2059048 command_runner.go:130] >           "cniConfDir": "",
	I1219 06:05:53.355163 2059048 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1219 06:05:53.355167 2059048 command_runner.go:130] >           "io_type": "",
	I1219 06:05:53.355171 2059048 command_runner.go:130] >           "options": {
	I1219 06:05:53.355174 2059048 command_runner.go:130] >             "BinaryName": "",
	I1219 06:05:53.355179 2059048 command_runner.go:130] >             "CriuImagePath": "",
	I1219 06:05:53.355183 2059048 command_runner.go:130] >             "CriuWorkPath": "",
	I1219 06:05:53.355187 2059048 command_runner.go:130] >             "IoGid": 0,
	I1219 06:05:53.355190 2059048 command_runner.go:130] >             "IoUid": 0,
	I1219 06:05:53.355198 2059048 command_runner.go:130] >             "NoNewKeyring": false,
	I1219 06:05:53.355201 2059048 command_runner.go:130] >             "Root": "",
	I1219 06:05:53.355205 2059048 command_runner.go:130] >             "ShimCgroup": "",
	I1219 06:05:53.355210 2059048 command_runner.go:130] >             "SystemdCgroup": false
	I1219 06:05:53.355214 2059048 command_runner.go:130] >           },
	I1219 06:05:53.355219 2059048 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1219 06:05:53.355225 2059048 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1219 06:05:53.355229 2059048 command_runner.go:130] >           "runtimePath": "",
	I1219 06:05:53.355233 2059048 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1219 06:05:53.355238 2059048 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1219 06:05:53.355242 2059048 command_runner.go:130] >           "snapshotter": ""
	I1219 06:05:53.355245 2059048 command_runner.go:130] >         }
	I1219 06:05:53.355248 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355252 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355262 2059048 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1219 06:05:53.355267 2059048 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1219 06:05:53.355273 2059048 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1219 06:05:53.355277 2059048 command_runner.go:130] >     "disableApparmor": false,
	I1219 06:05:53.355282 2059048 command_runner.go:130] >     "disableHugetlbController": true,
	I1219 06:05:53.355286 2059048 command_runner.go:130] >     "disableProcMount": false,
	I1219 06:05:53.355290 2059048 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1219 06:05:53.355294 2059048 command_runner.go:130] >     "enableCDI": true,
	I1219 06:05:53.355298 2059048 command_runner.go:130] >     "enableSelinux": false,
	I1219 06:05:53.355302 2059048 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1219 06:05:53.355306 2059048 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1219 06:05:53.355311 2059048 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1219 06:05:53.355319 2059048 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1219 06:05:53.355323 2059048 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1219 06:05:53.355328 2059048 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1219 06:05:53.355332 2059048 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1219 06:05:53.355338 2059048 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1219 06:05:53.355342 2059048 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1219 06:05:53.355347 2059048 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1219 06:05:53.355357 2059048 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1219 06:05:53.355362 2059048 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1219 06:05:53.355365 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355369 2059048 command_runner.go:130] >   "features": {
	I1219 06:05:53.355373 2059048 command_runner.go:130] >     "supplemental_groups_policy": true
	I1219 06:05:53.355376 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355379 2059048 command_runner.go:130] >   "golang": "go1.24.9",
	I1219 06:05:53.355389 2059048 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1219 06:05:53.355399 2059048 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1219 06:05:53.355402 2059048 command_runner.go:130] >   "runtimeHandlers": [
	I1219 06:05:53.355406 2059048 command_runner.go:130] >     {
	I1219 06:05:53.355409 2059048 command_runner.go:130] >       "features": {
	I1219 06:05:53.355414 2059048 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1219 06:05:53.355418 2059048 command_runner.go:130] >         "user_namespaces": true
	I1219 06:05:53.355421 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355424 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355427 2059048 command_runner.go:130] >     {
	I1219 06:05:53.355431 2059048 command_runner.go:130] >       "features": {
	I1219 06:05:53.355436 2059048 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1219 06:05:53.355440 2059048 command_runner.go:130] >         "user_namespaces": true
	I1219 06:05:53.355443 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355447 2059048 command_runner.go:130] >       "name": "runc"
	I1219 06:05:53.355449 2059048 command_runner.go:130] >     }
	I1219 06:05:53.355452 2059048 command_runner.go:130] >   ],
	I1219 06:05:53.355456 2059048 command_runner.go:130] >   "status": {
	I1219 06:05:53.355460 2059048 command_runner.go:130] >     "conditions": [
	I1219 06:05:53.355463 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355467 2059048 command_runner.go:130] >         "message": "",
	I1219 06:05:53.355471 2059048 command_runner.go:130] >         "reason": "",
	I1219 06:05:53.355475 2059048 command_runner.go:130] >         "status": true,
	I1219 06:05:53.355480 2059048 command_runner.go:130] >         "type": "RuntimeReady"
	I1219 06:05:53.355483 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355486 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355495 2059048 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1219 06:05:53.355500 2059048 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1219 06:05:53.355504 2059048 command_runner.go:130] >         "status": false,
	I1219 06:05:53.355508 2059048 command_runner.go:130] >         "type": "NetworkReady"
	I1219 06:05:53.355512 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355515 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355536 2059048 command_runner.go:130] >         "message": "{\"io.containerd.deprecation/cgroup-v1\":\"The support for cgroup v1 is deprecated since containerd v2.2 and will be removed by no later than May 2029. Upgrade the host to use cgroup v2.\"}",
	I1219 06:05:53.355541 2059048 command_runner.go:130] >         "reason": "ContainerdHasDeprecationWarnings",
	I1219 06:05:53.355547 2059048 command_runner.go:130] >         "status": false,
	I1219 06:05:53.355552 2059048 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1219 06:05:53.355555 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355557 2059048 command_runner.go:130] >     ]
	I1219 06:05:53.355560 2059048 command_runner.go:130] >   }
	I1219 06:05:53.355563 2059048 command_runner.go:130] > }
	I1219 06:05:53.357747 2059048 cni.go:84] Creating CNI manager for ""
	I1219 06:05:53.357770 2059048 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:05:53.357795 2059048 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1219 06:05:53.357824 2059048 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-006924 NodeName:functional-006924 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1219 06:05:53.357938 2059048 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-006924"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1219 06:05:53.358021 2059048 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1219 06:05:53.365051 2059048 command_runner.go:130] > kubeadm
	I1219 06:05:53.365070 2059048 command_runner.go:130] > kubectl
	I1219 06:05:53.365074 2059048 command_runner.go:130] > kubelet
	I1219 06:05:53.366033 2059048 binaries.go:51] Found k8s binaries, skipping transfer
	I1219 06:05:53.366118 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1219 06:05:53.373810 2059048 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1219 06:05:53.386231 2059048 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1219 06:05:53.399156 2059048 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2235 bytes)
	I1219 06:05:53.411832 2059048 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1219 06:05:53.415476 2059048 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1219 06:05:53.415580 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:53.524736 2059048 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:05:53.900522 2059048 certs.go:69] Setting up /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924 for IP: 192.168.49.2
	I1219 06:05:53.900547 2059048 certs.go:195] generating shared ca certs ...
	I1219 06:05:53.900563 2059048 certs.go:227] acquiring lock for ca certs: {Name:mk382c71693ea4061363f97b153b21bf6cdf5f38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:53.900702 2059048 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key
	I1219 06:05:53.900780 2059048 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key
	I1219 06:05:53.900803 2059048 certs.go:257] generating profile certs ...
	I1219 06:05:53.900908 2059048 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key
	I1219 06:05:53.900976 2059048 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key.febe6fed
	I1219 06:05:53.901024 2059048 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key
	I1219 06:05:53.901037 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1219 06:05:53.901081 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1219 06:05:53.901098 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1219 06:05:53.901109 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1219 06:05:53.901127 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1219 06:05:53.901139 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1219 06:05:53.901154 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1219 06:05:53.901171 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1219 06:05:53.901229 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem (1338 bytes)
	W1219 06:05:53.901264 2059048 certs.go:480] ignoring /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386_empty.pem, impossibly tiny 0 bytes
	I1219 06:05:53.901277 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem (1679 bytes)
	I1219 06:05:53.901306 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem (1078 bytes)
	I1219 06:05:53.901333 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem (1123 bytes)
	I1219 06:05:53.901365 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem (1671 bytes)
	I1219 06:05:53.901418 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:05:53.901449 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:53.901465 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem -> /usr/share/ca-certificates/2000386.pem
	I1219 06:05:53.901481 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> /usr/share/ca-certificates/20003862.pem
	I1219 06:05:53.902039 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1219 06:05:53.926748 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1219 06:05:53.945718 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1219 06:05:53.964111 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1219 06:05:53.984388 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1219 06:05:54.005796 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1219 06:05:54.027058 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1219 06:05:54.045330 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1219 06:05:54.062681 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1219 06:05:54.080390 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem --> /usr/share/ca-certificates/2000386.pem (1338 bytes)
	I1219 06:05:54.102399 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /usr/share/ca-certificates/20003862.pem (1708 bytes)
	I1219 06:05:54.120580 2059048 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (722 bytes)
	I1219 06:05:54.133732 2059048 ssh_runner.go:195] Run: openssl version
	I1219 06:05:54.139799 2059048 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1219 06:05:54.140191 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.147812 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/20003862.pem /etc/ssl/certs/20003862.pem
	I1219 06:05:54.155315 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159037 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159108 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159165 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.200029 2059048 command_runner.go:130] > 3ec20f2e
	I1219 06:05:54.200546 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1219 06:05:54.208733 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.216254 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1219 06:05:54.224240 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228059 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228165 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228244 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.268794 2059048 command_runner.go:130] > b5213941
	I1219 06:05:54.269372 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1219 06:05:54.277054 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.284467 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/2000386.pem /etc/ssl/certs/2000386.pem
	I1219 06:05:54.291949 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295750 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295798 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295849 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.341163 2059048 command_runner.go:130] > 51391683
	I1219 06:05:54.341782 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1219 06:05:54.349497 2059048 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:05:54.353229 2059048 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:05:54.353253 2059048 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1219 06:05:54.353261 2059048 command_runner.go:130] > Device: 259,1	Inode: 1582667     Links: 1
	I1219 06:05:54.353268 2059048 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1219 06:05:54.353275 2059048 command_runner.go:130] > Access: 2025-12-19 06:01:47.245300782 +0000
	I1219 06:05:54.353281 2059048 command_runner.go:130] > Modify: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353286 2059048 command_runner.go:130] > Change: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353294 2059048 command_runner.go:130] >  Birth: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353372 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1219 06:05:54.398897 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.399374 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1219 06:05:54.440111 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.440565 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1219 06:05:54.481409 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.481968 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1219 06:05:54.522576 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.523020 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1219 06:05:54.563365 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.563892 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1219 06:05:54.604428 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.604920 2059048 kubeadm.go:401] StartCluster: {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA A
PIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false Cust
omQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:54.605002 2059048 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1219 06:05:54.605063 2059048 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:05:54.631433 2059048 cri.go:92] found id: ""
	I1219 06:05:54.631512 2059048 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1219 06:05:54.638289 2059048 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1219 06:05:54.638353 2059048 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1219 06:05:54.638374 2059048 command_runner.go:130] > /var/lib/minikube/etcd:
	I1219 06:05:54.639191 2059048 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1219 06:05:54.639207 2059048 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1219 06:05:54.639278 2059048 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1219 06:05:54.646289 2059048 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:05:54.646704 2059048 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-006924" does not appear in /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.646809 2059048 kubeconfig.go:62] /home/jenkins/minikube-integration/22230-1998525/kubeconfig needs updating (will repair): [kubeconfig missing "functional-006924" cluster setting kubeconfig missing "functional-006924" context setting]
	I1219 06:05:54.647118 2059048 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/kubeconfig: {Name:mk7db1732c7d76f01100426cb283dc7515a3b9ea Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.647542 2059048 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.647700 2059048 kapi.go:59] client config for functional-006924: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(ni
l), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1219 06:05:54.648289 2059048 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1219 06:05:54.648312 2059048 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1219 06:05:54.648318 2059048 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1219 06:05:54.648377 2059048 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1219 06:05:54.648389 2059048 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1219 06:05:54.648357 2059048 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1219 06:05:54.648779 2059048 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1219 06:05:54.659696 2059048 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1219 06:05:54.659739 2059048 kubeadm.go:602] duration metric: took 20.517186ms to restartPrimaryControlPlane
	I1219 06:05:54.659750 2059048 kubeadm.go:403] duration metric: took 54.838405ms to StartCluster
	I1219 06:05:54.659766 2059048 settings.go:142] acquiring lock: {Name:mk0fb518a1861caea9ce90c087e9f98ff93c6842 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.659859 2059048 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.660602 2059048 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/kubeconfig: {Name:mk7db1732c7d76f01100426cb283dc7515a3b9ea Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.660878 2059048 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1219 06:05:54.661080 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:54.661197 2059048 addons.go:543] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1219 06:05:54.661465 2059048 addons.go:70] Setting storage-provisioner=true in profile "functional-006924"
	I1219 06:05:54.661481 2059048 addons.go:239] Setting addon storage-provisioner=true in "functional-006924"
	I1219 06:05:54.661506 2059048 host.go:66] Checking if "functional-006924" exists ...
	I1219 06:05:54.661954 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.662128 2059048 addons.go:70] Setting default-storageclass=true in profile "functional-006924"
	I1219 06:05:54.662158 2059048 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-006924"
	I1219 06:05:54.662427 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.667300 2059048 out.go:179] * Verifying Kubernetes components...
	I1219 06:05:54.673650 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:54.689683 2059048 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.689848 2059048 kapi.go:59] client config for functional-006924: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(ni
l), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1219 06:05:54.690123 2059048 addons.go:239] Setting addon default-storageclass=true in "functional-006924"
	I1219 06:05:54.690152 2059048 host.go:66] Checking if "functional-006924" exists ...
	I1219 06:05:54.690560 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.715008 2059048 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1219 06:05:54.717850 2059048 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:54.717879 2059048 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1219 06:05:54.717946 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:54.734767 2059048 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:54.734788 2059048 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1219 06:05:54.734856 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:54.764236 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:54.773070 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:54.876977 2059048 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:05:54.898675 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:54.923995 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:55.652544 2059048 node_ready.go:35] waiting up to 6m0s for node "functional-006924" to be "Ready" ...
	I1219 06:05:55.652680 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:55.652777 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:55.653088 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.653133 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653174 2059048 retry.go:31] will retry after 152.748ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653242 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.653274 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653290 2059048 retry.go:31] will retry after 222.401366ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:55.806850 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:55.871164 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.871241 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.871268 2059048 retry.go:31] will retry after 248.166368ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.876351 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:55.932419 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.936105 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.936137 2059048 retry.go:31] will retry after 191.546131ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.120512 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:56.128049 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:56.153420 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:56.153544 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:56.153844 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:56.188805 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.192400 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.192475 2059048 retry.go:31] will retry after 421.141509ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.203130 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.203228 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.203252 2059048 retry.go:31] will retry after 495.708783ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.614800 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:56.653236 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:56.653361 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:56.653708 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:56.677894 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.677943 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.677993 2059048 retry.go:31] will retry after 980.857907ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.700099 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:56.755124 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.758623 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.758652 2059048 retry.go:31] will retry after 1.143622688s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.152911 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:57.153042 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:57.153399 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:57.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:57.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:57.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:05:57.653378 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:05:57.659518 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:57.724667 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:57.724716 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.724735 2059048 retry.go:31] will retry after 900.329628ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.903067 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:57.986230 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:57.986314 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.986340 2059048 retry.go:31] will retry after 1.7845791s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:58.153671 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:58.153749 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:58.154120 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:58.625732 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:58.653113 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:58.653187 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:58.653475 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:58.712944 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:58.713042 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:58.713071 2059048 retry.go:31] will retry after 2.322946675s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:59.153740 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:59.153822 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:59.154186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:59.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:59.652927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:59.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:59.771577 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:59.835749 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:59.839442 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:59.839476 2059048 retry.go:31] will retry after 2.412907222s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:00.152821 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:00.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:00.153306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:00.153393 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:00.653320 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:00.653404 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:00.653734 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:01.036322 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:01.102362 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:01.106179 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:01.106214 2059048 retry.go:31] will retry after 2.139899672s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:01.153490 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:01.153572 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:01.153855 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:01.653656 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:01.653732 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:01.654026 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:02.152793 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:02.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:02.153204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:02.252582 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:02.312437 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:02.312479 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:02.312500 2059048 retry.go:31] will retry after 1.566668648s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:02.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:02.652958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:02.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:02.653283 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:03.152957 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:03.153054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:03.153393 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:03.246844 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:03.302237 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:03.305728 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.305771 2059048 retry.go:31] will retry after 6.170177016s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.653408 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:03.653482 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:03.653834 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:03.880237 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:03.939688 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:03.939736 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.939756 2059048 retry.go:31] will retry after 4.919693289s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:04.153025 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:04.153101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:04.153368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:04.653333 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:04.653405 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:04.653716 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:04.653762 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:05.153589 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:05.153680 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:05.154012 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:05.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:05.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:05.653271 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:06.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:06.152922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:06.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:06.652913 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:06.652987 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:06.653350 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:07.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:07.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:07.153248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:07.153305 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:07.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:07.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:07.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:08.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:08.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.652856 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:08.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:08.653179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.859603 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:08.923746 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:08.923802 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:08.923824 2059048 retry.go:31] will retry after 7.49455239s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.153273 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:09.153361 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:09.153733 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:09.153794 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:09.476166 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:09.536340 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:09.536378 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.536397 2059048 retry.go:31] will retry after 3.264542795s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.652787 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:09.652863 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:09.653191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:10.152879 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:10.152955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:10.153217 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:10.653092 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:10.653172 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:10.653505 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:11.153189 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:11.153267 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:11.153564 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:11.653360 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:11.653432 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:11.653748 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:11.653809 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:12.153584 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:12.153667 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:12.154066 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:12.652816 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:12.652897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:12.653232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:12.801732 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:12.858668 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:12.858722 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:12.858742 2059048 retry.go:31] will retry after 7.015856992s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:13.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:13.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:13.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:13.652838 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:13.652915 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:13.653206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:14.152876 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:14.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:14.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:14.153340 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:14.653224 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:14.653299 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:14.653566 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:15.153381 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:15.153458 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:15.153856 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:15.653715 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:15.653796 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:15.654137 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:16.153469 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:16.153543 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:16.153826 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:16.153868 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:16.419404 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:16.476671 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:16.480081 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:16.480119 2059048 retry.go:31] will retry after 7.9937579s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:16.653575 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:16.653716 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:16.653985 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:17.153751 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:17.153850 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:17.154134 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:17.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:17.652939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:17.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:18.152893 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:18.152976 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:18.153301 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:18.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:18.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:18.653233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:18.653289 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:19.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:19.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:19.153227 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:19.652934 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:19.653010 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:19.653354 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:19.875781 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:19.950537 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:19.954067 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:19.954097 2059048 retry.go:31] will retry after 12.496952157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:20.153579 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:20.153656 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:20.154027 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:20.652751 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:20.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:20.653178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:21.153037 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:21.153112 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:21.153446 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:21.153504 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:21.652852 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:21.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:21.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:22.152882 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:22.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:22.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:22.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:22.652925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:22.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:23.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:23.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:23.153240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:23.652818 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:23.652892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:23.653158 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:23.653200 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:24.152783 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:24.152857 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:24.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:24.474774 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:24.538538 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:24.538585 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:24.538605 2059048 retry.go:31] will retry after 14.635173495s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:24.653139 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:24.653215 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:24.653538 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:25.153284 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:25.153354 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:25.153661 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:25.653607 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:25.653689 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:25.653986 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:25.654040 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:26.152728 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:26.152857 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:26.153186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:26.652777 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:26.652852 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:26.653175 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:27.152839 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:27.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:27.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:27.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:27.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:27.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:28.152874 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:28.152956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:28.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:28.153286 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:28.652849 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:28.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:28.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:29.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:29.152930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:29.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:29.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:29.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:29.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:30.152853 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:30.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:30.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:30.153348 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:30.653022 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:30.653115 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:30.653416 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:31.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:31.152960 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:31.153275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:31.652985 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:31.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:31.653405 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:32.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:32.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:32.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:32.451758 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:32.506473 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:32.509966 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:32.509998 2059048 retry.go:31] will retry after 31.028140902s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:32.653234 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:32.653309 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:32.653632 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:32.653749 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:33.153497 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:33.153583 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:33.153949 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:33.652732 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:33.652832 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:33.653182 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:34.152891 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:34.152964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:34.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:34.653098 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:34.653173 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:34.653525 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:35.153363 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:35.153489 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:35.153845 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:35.153907 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:35.653568 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:35.653649 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:35.653928 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:36.153725 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:36.153800 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:36.154115 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:36.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:36.652933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:36.653274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:37.153419 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:37.153492 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:37.153866 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:37.153952 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:37.653726 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:37.653797 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:37.654143 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:38.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:38.152933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:38.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:38.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:38.652935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:38.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:39.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:39.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:39.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:39.174643 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:39.239291 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:39.239335 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:39.239354 2059048 retry.go:31] will retry after 15.420333699s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:39.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:39.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:39.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:39.653316 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:40.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:40.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:40.153285 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:40.653056 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:40.653131 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:40.653494 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:41.153188 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:41.153263 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:41.153588 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:41.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:41.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:41.653248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:42.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:42.152964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:42.153379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:42.153461 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:42.652919 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:42.653000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:42.653314 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:43.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:43.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:43.153240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:43.652956 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:43.653027 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:43.653379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:44.152963 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:44.153044 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:44.153381 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:44.653201 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:44.653284 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:44.653550 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:44.653592 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:45.153794 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:45.153882 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:45.154325 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:45.653107 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:45.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:45.653497 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:46.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:46.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:46.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:46.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:46.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:46.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:47.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:47.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:47.153246 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:47.153293 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:47.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:47.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:47.653331 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:48.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:48.152904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:48.153254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:48.652976 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:48.653054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:48.653401 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:49.152928 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:49.153003 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:49.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:49.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:49.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:49.653266 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:49.653325 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:50.152835 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:50.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:50.153230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:50.653021 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:50.653097 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:50.653360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:51.152819 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:51.152892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:51.153216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:51.652938 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:51.653021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:51.653350 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:51.653404 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:52.152878 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:52.152997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:52.153340 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:52.653054 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:52.653126 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:52.653428 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:53.152809 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:53.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:53.153202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:53.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:53.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:53.653212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:54.152921 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:54.153000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:54.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:54.153361 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:54.653430 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:54.653504 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:54.653886 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:54.660097 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:54.724740 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:54.724806 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:54.724824 2059048 retry.go:31] will retry after 21.489743806s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:55.153047 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:55.153170 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:55.153542 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:55.653137 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:55.653210 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:55.653500 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:56.153216 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:56.153285 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:56.153620 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:56.153682 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:56.653420 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:56.653501 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:56.653832 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:57.153605 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:57.153702 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:57.154020 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:57.652746 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:57.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:57.653184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:58.152882 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:58.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:58.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:58.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:58.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:58.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:58.653262 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:59.152798 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:59.152874 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:59.153218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:59.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:59.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:59.653193 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:00.155125 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:00.155210 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:00.156183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:00.653343 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:00.653416 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:00.653737 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:00.653787 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:01.152970 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:01.153062 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:01.153389 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:01.652835 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:01.652908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:01.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:02.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:02.152952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:02.153330 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:02.652900 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:02.652986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:02.653368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:03.152826 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:03.152908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:03.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:03.153306 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:03.538820 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:07:03.598261 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:03.602187 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:03.602221 2059048 retry.go:31] will retry after 27.693032791s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:03.653410 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:03.653486 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:03.653840 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:04.153298 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:04.153371 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:04.153670 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:04.653539 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:04.653618 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:04.653956 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:05.153749 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:05.153837 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:05.154160 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:05.154219 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:05.653149 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:05.653217 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:05.653546 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:06.153378 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:06.153468 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:06.153799 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:06.653420 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:06.653494 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:06.653803 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:07.153113 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:07.153187 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:07.153451 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:07.652897 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:07.652979 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:07.653292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:07.653351 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:08.152825 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:08.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:08.153274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:08.652859 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:08.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:08.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:09.153667 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:09.153756 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:09.154076 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:09.652824 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:09.652899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:09.653232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:10.153341 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:10.153410 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:10.153757 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:10.153818 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:10.653710 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:10.653802 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:10.654164 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:11.152777 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:11.152862 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:11.153199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:11.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:11.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:11.653219 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:12.152831 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:12.152908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:12.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:12.652832 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:12.652911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:12.653226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:12.653273 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:13.152877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:13.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:13.153279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:13.652822 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:13.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:13.653241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:14.152814 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:14.152897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:14.153218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:14.653177 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:14.653250 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:14.653558 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:14.653611 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:15.153356 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:15.153436 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:15.153788 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:15.652725 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:15.652816 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:15.653161 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:16.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:16.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:16.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:16.215537 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:07:16.273841 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:16.273881 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:16.273899 2059048 retry.go:31] will retry after 30.872906877s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:16.653514 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:16.653598 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:16.653919 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:16.653970 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:17.153579 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:17.153656 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:17.153994 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:17.653597 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:17.653665 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:17.653945 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:18.152782 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:18.152859 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:18.153155 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:18.652836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:18.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:18.653269 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:19.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:19.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:19.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:19.153292 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:19.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:19.652916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:19.653250 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:20.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:20.152910 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:20.153258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:20.653266 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:20.653354 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:20.653711 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:21.153511 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:21.153585 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:21.153886 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:21.153948 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:21.653690 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:21.653776 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:21.654081 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:22.153312 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:22.153387 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:22.153749 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:22.653581 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:22.653661 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:22.654117 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:23.152715 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:23.152802 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:23.153141 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:23.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:23.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:23.653196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:23.653241 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:24.152848 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:24.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:24.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:24.653127 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:24.653203 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:24.653560 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:25.153321 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:25.153393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:25.153662 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:25.652744 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:25.652855 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:25.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:25.653298 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:26.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:26.153049 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:26.153397 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:26.652880 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:26.652963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:26.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:27.152811 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:27.152888 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:27.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:27.652936 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:27.653013 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:27.653346 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:27.653407 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:28.152886 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:28.152961 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:28.153298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:28.652829 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:28.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:28.653240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:29.152818 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:29.152890 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:29.153229 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:29.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:29.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:29.653200 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:30.152832 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:30.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:30.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:30.153321 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:30.652996 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:30.653069 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:30.653387 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:31.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:31.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:31.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:31.295743 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:07:31.365905 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:31.365953 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:31.366070 2059048 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1219 06:07:31.653338 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:31.653413 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:31.653757 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:32.153415 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:32.153519 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:32.153862 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:32.153934 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:32.653181 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:32.653249 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:32.653512 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:33.152815 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:33.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:33.153193 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:33.652817 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:33.652892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:33.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:34.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:34.152941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:34.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:34.653155 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:34.653231 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:34.653574 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:34.653631 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:35.153386 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:35.153461 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:35.153800 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:35.652767 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:35.652837 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:35.653104 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:36.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:36.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:36.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:36.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:36.652927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:36.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:37.152893 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:37.152978 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:37.153238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:37.153278 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:37.652927 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:37.653002 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:37.653295 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:38.152985 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:38.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:38.153404 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:38.652889 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:38.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:38.653220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:39.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:39.152920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:39.153233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:39.153294 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:39.652812 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:39.652889 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:39.653215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:40.152857 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:40.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:40.153187 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:40.653073 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:40.653148 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:40.653479 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:41.152804 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:41.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:41.153180 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:41.652771 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:41.652841 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:41.653154 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:41.653206 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:42.152884 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:42.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:42.153327 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:42.652853 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:42.652951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:42.653271 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:43.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:43.152931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:43.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:43.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:43.652918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:43.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:43.653314 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:44.152994 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:44.153073 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:44.153402 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:44.653413 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:44.653502 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:44.653799 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:45.153668 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:45.153801 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:45.154199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:45.653080 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:45.653158 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:45.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:45.653538 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:46.153253 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:46.153372 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:46.153620 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:46.653410 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:46.653505 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:46.653901 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:47.147624 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:07:47.153238 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:47.153313 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:47.153618 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:47.207245 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:47.207289 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:47.207381 2059048 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1219 06:07:47.212201 2059048 out.go:179] * Enabled addons: 
	I1219 06:07:47.215092 2059048 addons.go:546] duration metric: took 1m52.553895373s for enable addons: enabled=[]
	I1219 06:07:47.652840 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:47.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:47.653177 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:48.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:48.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:48.153274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:48.153336 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:48.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:48.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:48.653266 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:49.152877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:49.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:49.153222 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:49.652844 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:49.652940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:49.653312 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:50.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:50.152906 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:50.153448 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:50.153518 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:50.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:50.653101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:50.653362 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:51.153113 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:51.153194 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:51.153608 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:51.653414 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:51.653487 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:51.653829 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:52.153249 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:52.153337 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:52.153602 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:52.153645 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:52.653349 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:52.653422 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:52.653735 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:53.153542 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:53.153620 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:53.153960 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:53.653712 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:53.653793 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:53.654088 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:54.153153 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:54.153246 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:54.153650 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:54.153714 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:54.653597 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:54.653675 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:54.654059 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:55.153390 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:55.153470 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:55.153780 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:55.652892 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:55.652968 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:55.653343 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:56.153064 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:56.153144 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:56.153504 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:56.652934 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:56.653001 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:56.653305 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:56.653374 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:57.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:57.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:57.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	[... identical GET https://192.168.49.2:8441/api/v1/nodes/functional-006924 polling repeated every ~500 ms from 06:07:57 through 06:08:32, each request returning an empty response (status="" milliseconds=0) ...]
	W1219 06:07:59.153305 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	[... the same node_ready.go:55 "connection refused" warning repeated roughly every 2 s, last at 06:08:31.653310 ...]
	I1219 06:08:32.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:32.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:32.153234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:32.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:32.652974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:32.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:33.152813 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:33.152892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:33.153182 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:33.652873 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:33.652952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:33.653279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:33.653339 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:34.152888 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:34.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:34.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:34.653221 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:34.653303 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:34.653662 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:35.153491 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:35.153567 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:35.153923 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:35.653686 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:35.653756 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:35.654034 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:35.654075 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:36.152742 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:36.152852 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:36.153178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:36.652917 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:36.652991 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:36.653328 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:37.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:37.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:37.153269 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:37.652832 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:37.652905 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:37.653225 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:38.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:38.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:38.153256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:38.153311 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:38.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:38.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:38.653254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:39.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:39.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:39.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:39.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:39.652948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:39.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:40.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:40.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:40.153201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:40.653085 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:40.653160 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:40.653488 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:40.653543 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:41.152787 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:41.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:41.153181 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:41.652770 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:41.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:41.653122 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:42.152887 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:42.152974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:42.153376 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:42.653103 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:42.653188 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:42.653511 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:42.653570 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:43.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:43.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:43.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:43.652856 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:43.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:43.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:44.153027 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:44.153105 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:44.153433 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:44.653459 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:44.653530 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:44.653788 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:44.653840 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:45.153678 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:45.153766 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:45.156105 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=2
	I1219 06:08:45.653114 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:45.653196 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:45.653533 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:46.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:46.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:46.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:46.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:46.652950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:46.653258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:47.152995 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:47.153106 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:47.153459 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:47.153515 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:47.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:47.652955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:47.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:48.152852 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:48.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:48.153282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:48.652874 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:48.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:48.653318 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:49.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:49.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:49.153202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:49.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:49.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:49.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:49.653282 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:50.152954 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:50.153027 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:50.153317 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:50.653024 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:50.653102 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:50.653365 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:51.152850 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:51.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:51.153219 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:51.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:51.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:51.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:51.653343 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:52.152878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:52.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:52.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:52.652974 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:52.653059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:52.653395 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:53.153096 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:53.153174 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:53.153508 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:53.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:53.652951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:53.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:54.152839 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:54.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:54.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:54.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:54.653199 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:54.653273 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:54.653604 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:55.153420 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:55.153510 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:55.153789 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:55.652811 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:55.652903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:55.653294 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:56.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:56.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:56.153223 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:56.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:56.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:56.653259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:56.653316 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:57.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:57.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:57.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:57.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:57.652981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:57.653323 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:58.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:58.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:58.153209 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:58.652840 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:58.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:58.653217 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:59.152905 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:59.152981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:59.153315 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:59.153381 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:59.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:59.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:59.653183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:00.152906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:00.153005 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:00.153357 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:00.653205 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:00.653291 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:00.653625 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:01.153283 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:01.153360 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:01.153628 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:01.153671 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:01.653416 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:01.653497 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:01.653884 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:02.153557 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:02.153633 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:02.154010 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:02.652736 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:02.652830 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:02.653106 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:03.152802 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:03.152877 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:03.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:03.652921 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:03.652999 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:03.653309 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:03.653358 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:04.152885 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:04.152955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:04.153320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:04.653344 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:04.653416 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:04.653746 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:05.153560 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:05.153640 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:05.153974 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:05.652768 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:05.652867 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:05.653179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:06.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:06.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:06.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:06.153272 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:06.652921 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:06.653000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:06.653306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:07.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:07.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:07.153227 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:07.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:07.652898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:07.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:08.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:08.152914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:08.153262 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:08.153318 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:08.652911 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:08.652979 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:08.653282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:09.152916 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:09.152986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:09.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:09.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:09.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:09.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:10.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:10.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:10.153226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:10.653112 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:10.653192 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:10.653511 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:10.653577 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:11.153350 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:11.153429 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:11.153777 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:11.653085 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:11.653162 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:11.653429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:12.152806 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:12.152904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:12.153213 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:12.652905 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:12.652980 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:12.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:13.152912 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:13.152981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:13.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:13.153367 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:13.653051 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:13.653127 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:13.653415 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:14.152821 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:14.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:14.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:14.653028 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:14.653106 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:14.653360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:15.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:15.152912 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:15.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:15.653006 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:15.653081 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:15.653415 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:15.653467 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:16.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:16.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:16.153234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:16.652830 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:16.652908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:16.653204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:17.152901 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:17.152974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:17.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:17.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:17.652923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:17.653180 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:18.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:18.152893 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:18.153221 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:18.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:18.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:18.652988 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:18.653283 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:19.152887 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:19.152961 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:19.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:19.652913 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:19.652992 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:19.653321 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:20.153018 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:20.153092 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:20.153437 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:20.153501 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:20.653011 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:20.653093 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:20.653372 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:21.153069 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:21.153153 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:21.153484 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:21.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:21.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:21.653322 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:22.153458 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:22.153530 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:22.153790 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:22.153833 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:22.653650 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:22.653724 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:22.654057 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:23.152779 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:23.152853 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:23.153175 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:23.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:23.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:23.653280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:24.152829 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:24.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:24.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:24.653129 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:24.653203 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:24.653539 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:24.653595 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:25.153181 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:25.153256 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:25.153525 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:25.653492 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:25.653572 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:25.653896 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:26.153709 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:26.153785 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:26.154149 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:26.653439 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:26.653511 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:26.653845 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:26.653912 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:27.153641 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:27.153711 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:27.154059 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:27.653737 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:27.653813 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:27.654171 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:28.152737 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:28.152824 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:28.153136 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:28.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:28.652929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:28.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:29.152816 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:29.152890 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:29.153304 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:29.153363 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:29.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:29.652949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:29.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:30.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:30.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:30.153286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:30.653142 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:30.653226 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:30.653576 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:31.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:31.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:31.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:31.652853 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:31.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:31.653285 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:31.653341 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:32.153002 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:32.153077 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:32.153389 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:32.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:32.652928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:32.653191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:33.152906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:33.152985 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:33.153320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:33.653030 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:33.653112 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:33.653463 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:33.653523 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:34.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:34.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:34.153191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:34.653266 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:34.653343 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:34.653688 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:35.153480 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:35.153562 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:35.153920 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:35.653700 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:35.653779 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:35.654078 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:35.654124 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:36.152813 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:36.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:36.153244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:36.652824 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:36.652902 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:36.653244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:37.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:37.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:37.153200 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:37.652845 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:37.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:37.653218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:38.152831 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:38.152912 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:38.153208 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:38.153253 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:38.652887 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:38.652966 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:38.653228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:39.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:39.152913 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:39.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:39.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:39.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:39.653299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:40.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:40.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:40.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:40.153287 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:40.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:40.653107 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:40.653447 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:41.152920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:41.153249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:41.652882 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:41.652956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:41.653222 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:42.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:42.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:42.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:42.153350 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:42.653039 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:42.653114 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:42.653443 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:43.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:43.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:43.153196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:43.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:43.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:43.653298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:44.153021 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:44.153098 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:44.153446 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:44.153502 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:44.653394 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:44.653463 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:44.653758 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:45.153716 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:45.153844 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:45.154316 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:45.653443 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:45.653522 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:45.653863 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:46.153623 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:46.153701 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:46.153971 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:46.154014 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:46.653765 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:46.653843 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:46.654187 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:47.152841 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:47.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:47.153244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:47.652861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:47.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:47.653190 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:48.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:48.152954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:48.153355 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:48.653077 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:48.653151 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:48.653475 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:48.653535 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:49.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:49.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:49.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:49.652829 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:49.652905 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:49.653255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:50.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:50.153052 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:50.153380 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:50.653140 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:50.653211 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:50.653679 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:50.653731 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:51.153473 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:51.153550 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:51.154738 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=1
	I1219 06:09:51.653546 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:51.653618 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:51.653958 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:52.153274 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:52.153349 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:52.153606 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:52.653351 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:52.653426 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:52.653752 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:52.653808 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:53.153430 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:53.153501 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:53.153810 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:53.653040 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:53.653137 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:53.653483 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:54.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:54.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:54.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:54.652950 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:54.653032 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:54.653335 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:55.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:55.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:55.153273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:55.153315 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:55.653562 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:55.653634 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:55.653988 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:56.152721 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:56.152824 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:56.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:56.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:56.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:56.653204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:57.152895 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:57.152971 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:57.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:57.153359 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:57.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:57.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:57.653235 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:58.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:58.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:58.153268 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:58.652975 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:58.653058 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:58.653396 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:59.153105 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:59.153184 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:59.153474 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:59.153520 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:59.652864 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:59.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:59.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:00.152931 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:00.153036 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:00.153356 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:00.653257 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:00.653334 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:00.653658 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:01.153453 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:01.153528 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:01.153794 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:01.153845 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:01.653619 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:01.653697 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:01.654088 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:02.153731 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:02.153810 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:02.154155 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:02.652784 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:02.652868 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:02.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:03.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:03.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:03.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:03.652980 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:03.653056 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:03.653404 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:03.653465 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:04.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:04.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:04.153292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:04.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:04.653267 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:04.653580 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:05.152837 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:05.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:05.153255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:05.652984 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:05.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:05.653348 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:06.153037 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:06.153117 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:06.153467 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:06.153522 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:06.653186 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:06.653261 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:06.653599 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:07.152860 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:07.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:07.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:07.652828 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:07.652907 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:07.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:08.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:08.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:08.153282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:08.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:08.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:08.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:08.653242 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:09.152822 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:09.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:09.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:09.652859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:09.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:09.653294 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:10.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:10.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:10.153225 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:10.653128 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:10.653226 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:10.653575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:10.653636 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:11.153427 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:11.153513 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:11.153854 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:11.653325 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:11.653393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:11.653695 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:12.153493 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:12.153567 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:12.153867 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:12.653672 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:12.653754 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:12.654079 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:12.654129 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:13.152786 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:13.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:13.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:13.652847 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:13.652923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:13.653234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:14.152794 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:14.152866 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:14.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:14.653147 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:14.653224 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:14.653478 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:15.152818 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:15.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:15.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:15.153301 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:15.653095 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:15.653176 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:15.653536 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:16.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:16.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:16.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:16.652894 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:16.652982 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:16.653351 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:17.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:17.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:17.153312 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:17.153367 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:17.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:17.652943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:17.653235 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:18.152825 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:18.152903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:18.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:18.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:18.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:18.653272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:19.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:19.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:19.153179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:19.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:19.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:19.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:19.653346 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:20.152854 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:20.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:20.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:20.653045 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:20.653125 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:20.653445 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:21.153150 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:21.153227 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:21.153575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:21.652847 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:21.652931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:21.653260 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:22.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:22.152940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:22.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:22.153260 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:22.652895 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:22.652974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:22.653308 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:23.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:23.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:23.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:23.653479 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:23.653551 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:23.653818 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:24.153710 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:24.153800 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:24.154142 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:24.154201 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:24.653230 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:24.653310 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:24.653643 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:25.153415 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:25.153494 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:25.153825 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:25.652780 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:25.652869 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:25.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:26.152955 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:26.153029 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:26.153332 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:26.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:26.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:26.653203 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:26.653244 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:27.152819 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:27.152893 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:27.153237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:27.652953 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:27.653040 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:27.653394 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:28.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:28.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:28.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:28.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:28.652921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:28.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:28.653296 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:29.153007 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:29.153109 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:29.153490 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:29.652904 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:29.653002 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:29.653393 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:30.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:30.152941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:30.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:30.653036 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:30.653110 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:30.653469 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:30.653528 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:31.152866 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:31.152940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:31.153201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:31.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:31.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:31.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:32.152981 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:32.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:32.153421 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:32.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:32.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:32.653184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:33.152815 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:33.152902 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:33.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:33.153256 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:33.652858 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:33.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:33.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:34.153699 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:34.153778 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:34.154156 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:34.652927 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:34.653004 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:34.653344 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:35.152944 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:35.153053 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:35.153407 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:35.153473 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:35.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:35.653108 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:35.653439 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:36.152985 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:36.153058 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:36.153410 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:36.652997 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:36.653074 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:36.653385 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:37.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:37.152927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:37.153178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:37.652820 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:37.652898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:37.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:37.653293 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:38.152835 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:38.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:38.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:38.652893 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:38.652984 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:38.653356 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:39.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:39.152909 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:39.153250 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:39.652830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:39.652914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:39.653251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:40.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:40.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:40.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:40.153267 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:40.653044 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:40.653127 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:40.653472 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:41.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:41.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:41.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:41.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:41.653202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:42.152888 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:42.152978 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:42.153360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:42.153425 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:42.653108 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:42.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:42.653535 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:43.153293 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:43.153377 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:43.153699 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:43.653511 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:43.653596 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:43.653946 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:44.153626 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:44.153701 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:44.154058 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:44.154116 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:44.653518 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:44.653586 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:44.653839 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:45.153714 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:45.153803 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:45.154242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:45.653103 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:45.653182 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:45.653567 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:46.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:46.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:46.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:46.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:46.652901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:46.653220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:46.653276 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:47.153000 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:47.153090 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:47.153484 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:47.652874 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:47.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:47.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:48.152829 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:48.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:48.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:48.653001 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:48.653085 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:48.653425 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:48.653480 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:49.152866 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:49.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:49.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:49.652909 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:49.652997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:49.653352 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:50.152932 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:50.153010 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:50.153347 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:50.653023 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:50.653100 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:50.653383 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:51.153061 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:51.153137 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:51.153452 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:51.153512 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:51.653210 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:51.653296 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:51.653657 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:52.153489 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:52.153557 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:52.153876 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:52.653692 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:52.653768 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:52.654090 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:53.152787 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:53.152864 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:53.153157 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:53.652784 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:53.652862 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:53.653125 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:53.653167 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:54.152851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:54.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:54.153283 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:54.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:54.653265 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:54.653642 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:55.153555 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:55.153638 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:55.153984 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:55.652996 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:55.653078 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:55.653399 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:55.653461 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:56.153144 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:56.153229 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:56.153575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:56.653124 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:56.653197 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:56.653482 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:57.152912 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:57.152988 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:57.153306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:57.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:57.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:57.653287 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:58.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:58.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:58.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:58.153273 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:58.652822 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:58.652903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:58.653236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:59.152792 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:59.152869 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:59.153185 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:59.652732 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:59.652827 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:59.653140 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:00.152931 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:00.153022 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:00.153339 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:00.153391 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:00.653248 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:00.653323 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:00.653669 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:01.153449 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:01.153521 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:01.153868 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:01.653237 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:01.653336 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:01.653684 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:02.153489 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:02.153575 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:02.153909 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:02.153978 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:02.653742 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:02.653822 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:02.654154 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:03.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:03.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:03.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:03.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:03.652939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:03.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:04.152862 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:04.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:04.153190 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:04.653274 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:04.653353 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:04.653687 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:04.653753 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:05.153511 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:05.153585 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:05.153947 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:05.652710 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:05.652796 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:05.653061 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:06.152795 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:06.152871 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:06.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:06.652955 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:06.653039 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:06.653379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:07.152874 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:07.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:07.153220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:07.153260 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:07.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:07.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:07.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:08.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:08.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:08.153280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:08.652915 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:08.652997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:08.653258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:09.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:09.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:09.153313 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:09.153369 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:09.653051 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:09.653130 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:09.653466 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:10.152899 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:10.152977 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:10.153296 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:10.653043 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:10.653165 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:10.653488 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:11.152868 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:11.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:11.153273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:11.653413 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:11.653483 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:11.653817 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:11.653864 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:12.153612 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:12.153686 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:12.154026 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:12.652742 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:12.652850 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:12.653128 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:13.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:13.152886 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:13.153131 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:13.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:13.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:13.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:14.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:14.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:14.153275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:14.153331 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:14.653226 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:14.653309 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:14.653648 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:15.153413 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:15.153488 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:15.153804 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:15.652744 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:15.652844 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:15.653142 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:16.152784 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:16.152853 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:16.153159 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:16.652816 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:16.652891 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:16.653186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:16.653233 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:17.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:17.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:17.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:17.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:17.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:17.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:18.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:18.152914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:18.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:18.652945 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:18.653025 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:18.653318 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:18.653380 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:19.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:19.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:19.153231 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:19.652844 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:19.652920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:19.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:20.152860 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:20.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:20.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:20.653014 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:20.653092 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:20.653351 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:21.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:21.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:21.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:21.153341 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:21.653010 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:21.653087 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:21.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:22.152850 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:22.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:22.153186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:22.652828 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:22.652906 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:22.653231 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:23.152837 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:23.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:23.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:23.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:23.652942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:23.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:23.653241 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:24.152900 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:24.152984 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:24.153313 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:24.653189 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:24.653270 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:24.653611 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:25.152919 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:25.152989 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:25.153298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:25.653028 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:25.653101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:25.653438 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:25.653492 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:26.153176 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:26.153256 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:26.153570 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:26.652912 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:26.652986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:26.653378 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:27.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:27.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:27.153259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:27.652839 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:27.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:27.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:28.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:28.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:28.153233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:28.153276 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:28.652833 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:28.652907 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:28.653249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:29.152989 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:29.153064 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:29.153396 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:29.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:29.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:29.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:30.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:30.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:30.153258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:30.153317 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:30.653045 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:30.653118 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:30.653429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:31.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:31.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:31.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:31.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:31.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:31.653292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:32.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:32.152919 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:32.153241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:32.652864 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:32.652940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:32.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:32.653263 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:33.152791 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:33.152875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:33.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:33.652896 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:33.652977 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:33.653320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:34.152876 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:34.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:34.153289 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:34.653352 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:34.653430 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:34.653807 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:34.653869 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:35.153637 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:35.153718 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:35.154044 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:35.652946 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:35.653021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:35.653340 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:36.152965 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:36.153049 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:36.153384 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:36.653133 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:36.653213 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:36.653535 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:37.153283 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:37.153355 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:37.153669 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:37.153731 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:37.653516 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:37.653597 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:37.653938 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:38.153755 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:38.153833 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:38.154248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:38.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:38.652956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:38.653272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:39.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:39.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:39.153296 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:39.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:39.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:39.653289 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:39.653347 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:40.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:40.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:40.153212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:40.653026 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:40.653107 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:40.653447 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:41.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:41.153249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:41.652949 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:41.653030 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:41.653297 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:42.152907 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:42.153021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:42.153435 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:42.153505 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:42.653182 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:42.653258 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:42.653594 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:43.152852 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:43.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:43.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:43.652858 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:43.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:43.653276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:44.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:44.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:44.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:44.653236 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:44.653312 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:44.653574 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:44.653614 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:45.153508 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:45.153630 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:45.154114 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:45.653037 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:45.653120 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:45.653478 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:46.152854 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:46.152928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:46.153280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:46.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:46.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:46.653286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:47.152995 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:47.153070 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:47.153369 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:47.153415 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:47.652884 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:47.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:47.653305 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:48.152817 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:48.152895 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:48.153237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:48.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:48.652925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:48.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:49.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:49.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:49.153213 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:49.652807 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:49.652884 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:49.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:49.653285 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:50.152972 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:50.153050 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:50.153344 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:50.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:50.653264 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:50.653522 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:51.153213 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:51.153287 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:51.153583 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:51.653360 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:51.653435 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:51.653779 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:51.653840 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:52.153070 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:52.153143 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:52.153403 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:52.653101 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:52.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:52.653483 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:53.152800 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:53.152878 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:53.153196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:53.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:53.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:53.653274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:54.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:54.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:54.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:54.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:54.653291 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:54.653363 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:54.653706 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:55.152998 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:55.153089 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:55.153429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:55.653064 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:55.653125 2059048 node_ready.go:38] duration metric: took 6m0.000540604s for node "functional-006924" to be "Ready" ...
	I1219 06:11:55.656290 2059048 out.go:203] 
	W1219 06:11:55.659114 2059048 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1219 06:11:55.659135 2059048 out.go:285] * 
	W1219 06:11:55.661307 2059048 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1219 06:11:55.664349 2059048 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106546939Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106615724Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106710289Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106798117Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106860600Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106921951Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.106979256Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.107036249Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.107120804Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.107220809Z" level=info msg="Connect containerd service"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.107615078Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.108301575Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.121412760Z" level=info msg="Start subscribing containerd event"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.121497799Z" level=info msg="Start recovering state"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.122272830Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.122475040Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.164704607Z" level=info msg="Start event monitor"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.164941434Z" level=info msg="Start cni network conf syncer for default"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.165030970Z" level=info msg="Start streaming server"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.165132911Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.165363388Z" level=info msg="runtime interface starting up..."
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.165439754Z" level=info msg="starting plugins..."
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.165505280Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 19 06:05:53 functional-006924 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 19 06:05:53 functional-006924 containerd[5249]: time="2025-12-19T06:05:53.167961084Z" level=info msg="containerd successfully booted in 0.088140s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:11:59.810015    8586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:11:59.810440    8586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:11:59.811769    8586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:11:59.812557    8586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:11:59.814334    8586 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 06:11:59 up 10:54,  0 user,  load average: 0.20, 0.29, 0.73
	Linux functional-006924 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 19 06:11:56 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:11:57 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 811.
	Dec 19 06:11:57 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:57 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:57 functional-006924 kubelet[8441]: E1219 06:11:57.462990    8441 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:11:57 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:11:57 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:11:58 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 812.
	Dec 19 06:11:58 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:58 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:58 functional-006924 kubelet[8462]: E1219 06:11:58.233409    8462 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:11:58 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:11:58 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:11:58 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 813.
	Dec 19 06:11:58 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:58 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:58 functional-006924 kubelet[8498]: E1219 06:11:58.960479    8498 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:11:58 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:11:58 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:11:59 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 814.
	Dec 19 06:11:59 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:59 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:11:59 functional-006924 kubelet[8565]: E1219 06:11:59.716909    8565 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:11:59 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:11:59 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924: exit status 2 (365.820918ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-006924" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubectlGetPods (2.67s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd (2.24s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd
functional_test.go:731: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 kubectl -- --context functional-006924 get pods
functional_test.go:731: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 kubectl -- --context functional-006924 get pods: exit status 1 (107.194171ms)

                                                
                                                
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

                                                
                                                
** /stderr **
functional_test.go:734: failed to get pods. args "out/minikube-linux-arm64 -p functional-006924 kubectl -- --context functional-006924 get pods": exit status 1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-006924
helpers_test.go:244: (dbg) docker inspect functional-006924:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	        "Created": "2025-12-19T05:57:32.987616309Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2053574,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T05:57:33.050252475Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hostname",
	        "HostsPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hosts",
	        "LogPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6-json.log",
	        "Name": "/functional-006924",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-006924:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-006924",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	                "LowerDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/merged",
	                "UpperDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/diff",
	                "WorkDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-006924",
	                "Source": "/var/lib/docker/volumes/functional-006924/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-006924",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-006924",
	                "name.minikube.sigs.k8s.io": "functional-006924",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "c06ab2bd44169716d410789ed39ed6e7c04e20cbf7fddb96691439282b9c97ca",
	            "SandboxKey": "/var/run/docker/netns/c06ab2bd4416",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34704"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34705"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34708"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34706"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34707"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-006924": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "42:2f:87:6a:a8:7b",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "f63e8dc2cff83663f8a4d14108f192e61e457410fa4fc720cd9630dbf354815d",
	                    "EndpointID": "aa2b1cbd90d5c1f6130481423d97f82d974d4197e41ad0dbe3b7e51b22c8b4cc",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-006924",
	                        "651d0d6ef1db"
	                    ]
	                }
	            }
	        }
	    }
	]

                                                
                                                
-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924: exit status 2 (303.088129ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-006924 logs -n 25: (1.000193958s)
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                         ARGS                                                                          │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-125117 image build -t localhost/my-image:functional-125117 testdata/build --alsologtostderr                                                │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format json --alsologtostderr                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format table --alsologtostderr                                                                                           │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ delete         │ -p functional-125117                                                                                                                                  │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ start          │ -p functional-006924 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │                     │
	│ start          │ -p functional-006924 --alsologtostderr -v=8                                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:05 UTC │                     │
	│ cache          │ functional-006924 cache add registry.k8s.io/pause:3.1                                                                                                 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache add registry.k8s.io/pause:3.3                                                                                                 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache add registry.k8s.io/pause:latest                                                                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache add minikube-local-cache-test:functional-006924                                                                               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache delete minikube-local-cache-test:functional-006924                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ delete registry.k8s.io/pause:3.3                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ list                                                                                                                                                  │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl images                                                                                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                    │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │                     │
	│ cache          │ functional-006924 cache reload                                                                                                                        │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ delete registry.k8s.io/pause:3.1                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ delete registry.k8s.io/pause:latest                                                                                                                   │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ kubectl        │ functional-006924 kubectl -- --context functional-006924 get pods                                                                                     │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │                     │
	└────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 06:05:50
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 06:05:50.537990 2059048 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:05:50.538849 2059048 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:05:50.538894 2059048 out.go:374] Setting ErrFile to fd 2...
	I1219 06:05:50.538913 2059048 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:05:50.539188 2059048 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:05:50.539610 2059048 out.go:368] Setting JSON to false
	I1219 06:05:50.540502 2059048 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":38897,"bootTime":1766085454,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:05:50.540601 2059048 start.go:143] virtualization:  
	I1219 06:05:50.544140 2059048 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:05:50.547152 2059048 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:05:50.547218 2059048 notify.go:221] Checking for updates...
	I1219 06:05:50.550931 2059048 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:05:50.553869 2059048 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:50.556730 2059048 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:05:50.559634 2059048 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:05:50.562450 2059048 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:05:50.565702 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:50.565828 2059048 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:05:50.590709 2059048 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:05:50.590846 2059048 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:05:50.653898 2059048 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:05:50.644590744 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:05:50.654020 2059048 docker.go:319] overlay module found
	I1219 06:05:50.657204 2059048 out.go:179] * Using the docker driver based on existing profile
	I1219 06:05:50.660197 2059048 start.go:309] selected driver: docker
	I1219 06:05:50.660214 2059048 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:50.660310 2059048 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:05:50.660408 2059048 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:05:50.713439 2059048 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:05:50.704333478 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:05:50.713872 2059048 cni.go:84] Creating CNI manager for ""
	I1219 06:05:50.713935 2059048 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:05:50.713992 2059048 start.go:353] cluster config:
	{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:50.717210 2059048 out.go:179] * Starting "functional-006924" primary control-plane node in "functional-006924" cluster
	I1219 06:05:50.719980 2059048 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1219 06:05:50.722935 2059048 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1219 06:05:50.726070 2059048 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:05:50.726124 2059048 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1219 06:05:50.726135 2059048 cache.go:65] Caching tarball of preloaded images
	I1219 06:05:50.726179 2059048 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1219 06:05:50.726225 2059048 preload.go:238] Found /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1219 06:05:50.726236 2059048 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1219 06:05:50.726339 2059048 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/config.json ...
	I1219 06:05:50.745888 2059048 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1219 06:05:50.745915 2059048 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1219 06:05:50.745932 2059048 cache.go:243] Successfully downloaded all kic artifacts
	I1219 06:05:50.745963 2059048 start.go:360] acquireMachinesLock for functional-006924: {Name:mkc84f48e83d18024791d45db780f3ccd746613a Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1219 06:05:50.746023 2059048 start.go:364] duration metric: took 37.752µs to acquireMachinesLock for "functional-006924"
	I1219 06:05:50.746049 2059048 start.go:96] Skipping create...Using existing machine configuration
	I1219 06:05:50.746059 2059048 fix.go:54] fixHost starting: 
	I1219 06:05:50.746334 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:50.762745 2059048 fix.go:112] recreateIfNeeded on functional-006924: state=Running err=<nil>
	W1219 06:05:50.762777 2059048 fix.go:138] unexpected machine state, will restart: <nil>
	I1219 06:05:50.765990 2059048 out.go:252] * Updating the running docker "functional-006924" container ...
	I1219 06:05:50.766020 2059048 machine.go:94] provisionDockerMachine start ...
	I1219 06:05:50.766101 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:50.782668 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:50.783000 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:50.783017 2059048 main.go:144] libmachine: About to run SSH command:
	hostname
	I1219 06:05:50.940618 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:05:50.940641 2059048 ubuntu.go:182] provisioning hostname "functional-006924"
	I1219 06:05:50.940708 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:50.964854 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:50.965181 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:50.965199 2059048 main.go:144] libmachine: About to run SSH command:
	sudo hostname functional-006924 && echo "functional-006924" | sudo tee /etc/hostname
	I1219 06:05:51.129720 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:05:51.129816 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.147357 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:51.147663 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:51.147686 2059048 main.go:144] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-006924' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-006924/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-006924' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1219 06:05:51.301923 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: 
	I1219 06:05:51.301949 2059048 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22230-1998525/.minikube CaCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22230-1998525/.minikube}
	I1219 06:05:51.301977 2059048 ubuntu.go:190] setting up certificates
	I1219 06:05:51.301985 2059048 provision.go:84] configureAuth start
	I1219 06:05:51.302047 2059048 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:05:51.323653 2059048 provision.go:143] copyHostCerts
	I1219 06:05:51.323700 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:05:51.323742 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem, removing ...
	I1219 06:05:51.323756 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:05:51.323832 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem (1078 bytes)
	I1219 06:05:51.323915 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:05:51.323932 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem, removing ...
	I1219 06:05:51.323937 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:05:51.323964 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem (1123 bytes)
	I1219 06:05:51.324003 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:05:51.324018 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem, removing ...
	I1219 06:05:51.324022 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:05:51.324044 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem (1671 bytes)
	I1219 06:05:51.324090 2059048 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem org=jenkins.functional-006924 san=[127.0.0.1 192.168.49.2 functional-006924 localhost minikube]
	I1219 06:05:51.441821 2059048 provision.go:177] copyRemoteCerts
	I1219 06:05:51.441886 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1219 06:05:51.441926 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.459787 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.570296 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1219 06:05:51.570372 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1219 06:05:51.588363 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1219 06:05:51.588477 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1219 06:05:51.605684 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1219 06:05:51.605798 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1219 06:05:51.623473 2059048 provision.go:87] duration metric: took 321.473451ms to configureAuth
	I1219 06:05:51.623556 2059048 ubuntu.go:206] setting minikube options for container-runtime
	I1219 06:05:51.623741 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:51.623756 2059048 machine.go:97] duration metric: took 857.728961ms to provisionDockerMachine
	I1219 06:05:51.623765 2059048 start.go:293] postStartSetup for "functional-006924" (driver="docker")
	I1219 06:05:51.623788 2059048 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1219 06:05:51.623849 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1219 06:05:51.623892 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.641371 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.760842 2059048 ssh_runner.go:195] Run: cat /etc/os-release
	I1219 06:05:51.764225 2059048 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1219 06:05:51.764245 2059048 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1219 06:05:51.764250 2059048 command_runner.go:130] > VERSION_ID="12"
	I1219 06:05:51.764255 2059048 command_runner.go:130] > VERSION="12 (bookworm)"
	I1219 06:05:51.764259 2059048 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1219 06:05:51.764263 2059048 command_runner.go:130] > ID=debian
	I1219 06:05:51.764268 2059048 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1219 06:05:51.764273 2059048 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1219 06:05:51.764281 2059048 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1219 06:05:51.764323 2059048 main.go:144] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1219 06:05:51.764339 2059048 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1219 06:05:51.764350 2059048 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/addons for local assets ...
	I1219 06:05:51.764404 2059048 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/files for local assets ...
	I1219 06:05:51.764485 2059048 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> 20003862.pem in /etc/ssl/certs
	I1219 06:05:51.764491 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> /etc/ssl/certs/20003862.pem
	I1219 06:05:51.764572 2059048 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> hosts in /etc/test/nested/copy/2000386
	I1219 06:05:51.764576 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> /etc/test/nested/copy/2000386/hosts
	I1219 06:05:51.764619 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/2000386
	I1219 06:05:51.772196 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:05:51.790438 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts --> /etc/test/nested/copy/2000386/hosts (40 bytes)
	I1219 06:05:51.808099 2059048 start.go:296] duration metric: took 184.303334ms for postStartSetup
	I1219 06:05:51.808203 2059048 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:05:51.808277 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.825566 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.929610 2059048 command_runner.go:130] > 14%
	I1219 06:05:51.930200 2059048 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1219 06:05:51.934641 2059048 command_runner.go:130] > 169G
	I1219 06:05:51.935117 2059048 fix.go:56] duration metric: took 1.189053781s for fixHost
	I1219 06:05:51.935139 2059048 start.go:83] releasing machines lock for "functional-006924", held for 1.189101272s
	I1219 06:05:51.935225 2059048 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:05:51.954055 2059048 ssh_runner.go:195] Run: cat /version.json
	I1219 06:05:51.954105 2059048 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1219 06:05:51.954110 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.954164 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.979421 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.998216 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:52.088735 2059048 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1219 06:05:52.088901 2059048 ssh_runner.go:195] Run: systemctl --version
	I1219 06:05:52.184102 2059048 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1219 06:05:52.186843 2059048 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1219 06:05:52.186921 2059048 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1219 06:05:52.187021 2059048 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1219 06:05:52.191424 2059048 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1219 06:05:52.191590 2059048 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1219 06:05:52.191669 2059048 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1219 06:05:52.199647 2059048 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1219 06:05:52.199671 2059048 start.go:496] detecting cgroup driver to use...
	I1219 06:05:52.199702 2059048 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1219 06:05:52.199771 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1219 06:05:52.215530 2059048 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1219 06:05:52.228927 2059048 docker.go:218] disabling cri-docker service (if available) ...
	I1219 06:05:52.229039 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1219 06:05:52.245166 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1219 06:05:52.258582 2059048 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1219 06:05:52.378045 2059048 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1219 06:05:52.513092 2059048 docker.go:234] disabling docker service ...
	I1219 06:05:52.513180 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1219 06:05:52.528704 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1219 06:05:52.542109 2059048 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1219 06:05:52.652456 2059048 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1219 06:05:52.767269 2059048 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1219 06:05:52.781039 2059048 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1219 06:05:52.797281 2059048 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1219 06:05:52.797396 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1219 06:05:52.807020 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1219 06:05:52.816571 2059048 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1219 06:05:52.816661 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1219 06:05:52.826225 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:05:52.835109 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1219 06:05:52.843741 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:05:52.852504 2059048 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1219 06:05:52.860160 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1219 06:05:52.868883 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1219 06:05:52.877906 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1219 06:05:52.887403 2059048 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1219 06:05:52.894024 2059048 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1219 06:05:52.894921 2059048 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1219 06:05:52.902164 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:53.021703 2059048 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1219 06:05:53.168216 2059048 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1219 06:05:53.168331 2059048 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1219 06:05:53.171951 2059048 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1219 06:05:53.172022 2059048 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1219 06:05:53.172043 2059048 command_runner.go:130] > Device: 0,72	Inode: 1614        Links: 1
	I1219 06:05:53.172065 2059048 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1219 06:05:53.172084 2059048 command_runner.go:130] > Access: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172112 2059048 command_runner.go:130] > Modify: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172131 2059048 command_runner.go:130] > Change: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172148 2059048 command_runner.go:130] >  Birth: -
	I1219 06:05:53.172331 2059048 start.go:564] Will wait 60s for crictl version
	I1219 06:05:53.172432 2059048 ssh_runner.go:195] Run: which crictl
	I1219 06:05:53.175887 2059048 command_runner.go:130] > /usr/local/bin/crictl
	I1219 06:05:53.176199 2059048 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1219 06:05:53.203136 2059048 command_runner.go:130] > Version:  0.1.0
	I1219 06:05:53.203389 2059048 command_runner.go:130] > RuntimeName:  containerd
	I1219 06:05:53.203588 2059048 command_runner.go:130] > RuntimeVersion:  v2.2.0
	I1219 06:05:53.203784 2059048 command_runner.go:130] > RuntimeApiVersion:  v1
	I1219 06:05:53.207710 2059048 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1219 06:05:53.207845 2059048 ssh_runner.go:195] Run: containerd --version
	I1219 06:05:53.235328 2059048 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1219 06:05:53.237219 2059048 ssh_runner.go:195] Run: containerd --version
	I1219 06:05:53.254490 2059048 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1219 06:05:53.262101 2059048 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1219 06:05:53.264978 2059048 cli_runner.go:164] Run: docker network inspect functional-006924 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 06:05:53.280549 2059048 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1219 06:05:53.284647 2059048 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1219 06:05:53.284847 2059048 kubeadm.go:884] updating cluster {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeC
A APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false C
ustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1219 06:05:53.284979 2059048 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:05:53.285048 2059048 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:05:53.307306 2059048 command_runner.go:130] > {
	I1219 06:05:53.307331 2059048 command_runner.go:130] >   "images":  [
	I1219 06:05:53.307335 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307345 2059048 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1219 06:05:53.307350 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307356 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1219 06:05:53.307360 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307365 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307373 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1219 06:05:53.307380 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307385 2059048 command_runner.go:130] >       "size":  "40636774",
	I1219 06:05:53.307391 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307395 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307402 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307405 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307417 2059048 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1219 06:05:53.307425 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307431 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1219 06:05:53.307435 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307441 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307450 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1219 06:05:53.307455 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307460 2059048 command_runner.go:130] >       "size":  "8034419",
	I1219 06:05:53.307463 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307467 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307470 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307482 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307492 2059048 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1219 06:05:53.307496 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307501 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1219 06:05:53.307505 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307519 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307528 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1219 06:05:53.307534 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307538 2059048 command_runner.go:130] >       "size":  "21168808",
	I1219 06:05:53.307542 2059048 command_runner.go:130] >       "username":  "nonroot",
	I1219 06:05:53.307546 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307549 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307552 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307559 2059048 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1219 06:05:53.307565 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307570 2059048 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1219 06:05:53.307581 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307585 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307592 2059048 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1219 06:05:53.307598 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307602 2059048 command_runner.go:130] >       "size":  "21749640",
	I1219 06:05:53.307610 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307614 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307618 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307622 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307625 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307631 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307634 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307641 2059048 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1219 06:05:53.307647 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307653 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1219 06:05:53.307666 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307670 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307682 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1219 06:05:53.307689 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307693 2059048 command_runner.go:130] >       "size":  "24692223",
	I1219 06:05:53.307697 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307708 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307712 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307716 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307723 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307726 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307729 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307736 2059048 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1219 06:05:53.307742 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307748 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1219 06:05:53.307753 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307757 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307765 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1219 06:05:53.307769 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307773 2059048 command_runner.go:130] >       "size":  "20672157",
	I1219 06:05:53.307779 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307783 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307788 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307792 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307796 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307799 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307802 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307809 2059048 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1219 06:05:53.307813 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307821 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1219 06:05:53.307826 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307830 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307840 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1219 06:05:53.307845 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307849 2059048 command_runner.go:130] >       "size":  "22432301",
	I1219 06:05:53.307858 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307864 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307867 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307870 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307877 2059048 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1219 06:05:53.307884 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307889 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1219 06:05:53.307893 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307899 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307907 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1219 06:05:53.307913 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307917 2059048 command_runner.go:130] >       "size":  "15405535",
	I1219 06:05:53.307921 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307925 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307928 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307932 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307939 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307942 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307948 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307955 2059048 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1219 06:05:53.307963 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307967 2059048 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1219 06:05:53.307970 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307974 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307982 2059048 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1219 06:05:53.307987 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307991 2059048 command_runner.go:130] >       "size":  "267939",
	I1219 06:05:53.307996 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.308000 2059048 command_runner.go:130] >         "value":  "65535"
	I1219 06:05:53.308004 2059048 command_runner.go:130] >       },
	I1219 06:05:53.308011 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.308015 2059048 command_runner.go:130] >       "pinned":  true
	I1219 06:05:53.308020 2059048 command_runner.go:130] >     }
	I1219 06:05:53.308027 2059048 command_runner.go:130] >   ]
	I1219 06:05:53.308030 2059048 command_runner.go:130] > }
	I1219 06:05:53.310449 2059048 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:05:53.310472 2059048 containerd.go:534] Images already preloaded, skipping extraction
	I1219 06:05:53.310540 2059048 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:05:53.331271 2059048 command_runner.go:130] > {
	I1219 06:05:53.331288 2059048 command_runner.go:130] >   "images":  [
	I1219 06:05:53.331292 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331304 2059048 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1219 06:05:53.331309 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331314 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1219 06:05:53.331318 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331322 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331332 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1219 06:05:53.331336 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331340 2059048 command_runner.go:130] >       "size":  "40636774",
	I1219 06:05:53.331350 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331355 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331358 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331361 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331369 2059048 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1219 06:05:53.331373 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331378 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1219 06:05:53.331381 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331385 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331393 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1219 06:05:53.331396 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331400 2059048 command_runner.go:130] >       "size":  "8034419",
	I1219 06:05:53.331404 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331408 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331411 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331414 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331421 2059048 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1219 06:05:53.331425 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331430 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1219 06:05:53.331433 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331439 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331447 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1219 06:05:53.331451 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331454 2059048 command_runner.go:130] >       "size":  "21168808",
	I1219 06:05:53.331458 2059048 command_runner.go:130] >       "username":  "nonroot",
	I1219 06:05:53.331462 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331466 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331468 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331475 2059048 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1219 06:05:53.331479 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331484 2059048 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1219 06:05:53.331487 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331491 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331502 2059048 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1219 06:05:53.331506 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331510 2059048 command_runner.go:130] >       "size":  "21749640",
	I1219 06:05:53.331515 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331519 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331522 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331526 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331530 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331533 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331536 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331543 2059048 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1219 06:05:53.331547 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331551 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1219 06:05:53.331555 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331559 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331566 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1219 06:05:53.331569 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331573 2059048 command_runner.go:130] >       "size":  "24692223",
	I1219 06:05:53.331577 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331585 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331592 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331596 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331600 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331603 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331606 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331613 2059048 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1219 06:05:53.331617 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331622 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1219 06:05:53.331626 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331629 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331638 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1219 06:05:53.331641 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331645 2059048 command_runner.go:130] >       "size":  "20672157",
	I1219 06:05:53.331652 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331656 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331659 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331663 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331666 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331669 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331672 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331679 2059048 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1219 06:05:53.331683 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331688 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1219 06:05:53.331691 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331695 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331702 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1219 06:05:53.331705 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331709 2059048 command_runner.go:130] >       "size":  "22432301",
	I1219 06:05:53.331713 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331717 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331720 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331723 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331733 2059048 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1219 06:05:53.331737 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331742 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1219 06:05:53.331745 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331749 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331757 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1219 06:05:53.331760 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331764 2059048 command_runner.go:130] >       "size":  "15405535",
	I1219 06:05:53.331767 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331771 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331774 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331778 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331782 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331785 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331792 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331799 2059048 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1219 06:05:53.331803 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331807 2059048 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1219 06:05:53.331811 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331815 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331822 2059048 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1219 06:05:53.331826 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331829 2059048 command_runner.go:130] >       "size":  "267939",
	I1219 06:05:53.331833 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331837 2059048 command_runner.go:130] >         "value":  "65535"
	I1219 06:05:53.331841 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331845 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331849 2059048 command_runner.go:130] >       "pinned":  true
	I1219 06:05:53.331852 2059048 command_runner.go:130] >     }
	I1219 06:05:53.331855 2059048 command_runner.go:130] >   ]
	I1219 06:05:53.331858 2059048 command_runner.go:130] > }
	I1219 06:05:53.333541 2059048 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:05:53.333565 2059048 cache_images.go:86] Images are preloaded, skipping loading
	I1219 06:05:53.333574 2059048 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1219 06:05:53.333694 2059048 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-006924 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1219 06:05:53.333773 2059048 ssh_runner.go:195] Run: sudo crictl --timeout=10s info
	I1219 06:05:53.354890 2059048 command_runner.go:130] > {
	I1219 06:05:53.354909 2059048 command_runner.go:130] >   "cniconfig": {
	I1219 06:05:53.354915 2059048 command_runner.go:130] >     "Networks": [
	I1219 06:05:53.354919 2059048 command_runner.go:130] >       {
	I1219 06:05:53.354926 2059048 command_runner.go:130] >         "Config": {
	I1219 06:05:53.354932 2059048 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1219 06:05:53.354937 2059048 command_runner.go:130] >           "Name": "cni-loopback",
	I1219 06:05:53.354941 2059048 command_runner.go:130] >           "Plugins": [
	I1219 06:05:53.354945 2059048 command_runner.go:130] >             {
	I1219 06:05:53.354949 2059048 command_runner.go:130] >               "Network": {
	I1219 06:05:53.354953 2059048 command_runner.go:130] >                 "ipam": {},
	I1219 06:05:53.354958 2059048 command_runner.go:130] >                 "type": "loopback"
	I1219 06:05:53.354962 2059048 command_runner.go:130] >               },
	I1219 06:05:53.354967 2059048 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1219 06:05:53.354971 2059048 command_runner.go:130] >             }
	I1219 06:05:53.354975 2059048 command_runner.go:130] >           ],
	I1219 06:05:53.354988 2059048 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1219 06:05:53.354992 2059048 command_runner.go:130] >         },
	I1219 06:05:53.354997 2059048 command_runner.go:130] >         "IFName": "lo"
	I1219 06:05:53.355000 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355003 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355007 2059048 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1219 06:05:53.355011 2059048 command_runner.go:130] >     "PluginDirs": [
	I1219 06:05:53.355015 2059048 command_runner.go:130] >       "/opt/cni/bin"
	I1219 06:05:53.355027 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355031 2059048 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1219 06:05:53.355036 2059048 command_runner.go:130] >     "Prefix": "eth"
	I1219 06:05:53.355039 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355042 2059048 command_runner.go:130] >   "config": {
	I1219 06:05:53.355046 2059048 command_runner.go:130] >     "cdiSpecDirs": [
	I1219 06:05:53.355050 2059048 command_runner.go:130] >       "/etc/cdi",
	I1219 06:05:53.355059 2059048 command_runner.go:130] >       "/var/run/cdi"
	I1219 06:05:53.355062 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355066 2059048 command_runner.go:130] >     "cni": {
	I1219 06:05:53.355070 2059048 command_runner.go:130] >       "binDir": "",
	I1219 06:05:53.355073 2059048 command_runner.go:130] >       "binDirs": [
	I1219 06:05:53.355077 2059048 command_runner.go:130] >         "/opt/cni/bin"
	I1219 06:05:53.355080 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.355084 2059048 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1219 06:05:53.355088 2059048 command_runner.go:130] >       "confTemplate": "",
	I1219 06:05:53.355091 2059048 command_runner.go:130] >       "ipPref": "",
	I1219 06:05:53.355095 2059048 command_runner.go:130] >       "maxConfNum": 1,
	I1219 06:05:53.355099 2059048 command_runner.go:130] >       "setupSerially": false,
	I1219 06:05:53.355103 2059048 command_runner.go:130] >       "useInternalLoopback": false
	I1219 06:05:53.355106 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355114 2059048 command_runner.go:130] >     "containerd": {
	I1219 06:05:53.355119 2059048 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1219 06:05:53.355123 2059048 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1219 06:05:53.355128 2059048 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1219 06:05:53.355132 2059048 command_runner.go:130] >       "runtimes": {
	I1219 06:05:53.355136 2059048 command_runner.go:130] >         "runc": {
	I1219 06:05:53.355140 2059048 command_runner.go:130] >           "ContainerAnnotations": null,
	I1219 06:05:53.355145 2059048 command_runner.go:130] >           "PodAnnotations": null,
	I1219 06:05:53.355151 2059048 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1219 06:05:53.355155 2059048 command_runner.go:130] >           "cgroupWritable": false,
	I1219 06:05:53.355159 2059048 command_runner.go:130] >           "cniConfDir": "",
	I1219 06:05:53.355163 2059048 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1219 06:05:53.355167 2059048 command_runner.go:130] >           "io_type": "",
	I1219 06:05:53.355171 2059048 command_runner.go:130] >           "options": {
	I1219 06:05:53.355174 2059048 command_runner.go:130] >             "BinaryName": "",
	I1219 06:05:53.355179 2059048 command_runner.go:130] >             "CriuImagePath": "",
	I1219 06:05:53.355183 2059048 command_runner.go:130] >             "CriuWorkPath": "",
	I1219 06:05:53.355187 2059048 command_runner.go:130] >             "IoGid": 0,
	I1219 06:05:53.355190 2059048 command_runner.go:130] >             "IoUid": 0,
	I1219 06:05:53.355198 2059048 command_runner.go:130] >             "NoNewKeyring": false,
	I1219 06:05:53.355201 2059048 command_runner.go:130] >             "Root": "",
	I1219 06:05:53.355205 2059048 command_runner.go:130] >             "ShimCgroup": "",
	I1219 06:05:53.355210 2059048 command_runner.go:130] >             "SystemdCgroup": false
	I1219 06:05:53.355214 2059048 command_runner.go:130] >           },
	I1219 06:05:53.355219 2059048 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1219 06:05:53.355225 2059048 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1219 06:05:53.355229 2059048 command_runner.go:130] >           "runtimePath": "",
	I1219 06:05:53.355233 2059048 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1219 06:05:53.355238 2059048 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1219 06:05:53.355242 2059048 command_runner.go:130] >           "snapshotter": ""
	I1219 06:05:53.355245 2059048 command_runner.go:130] >         }
	I1219 06:05:53.355248 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355252 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355262 2059048 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1219 06:05:53.355267 2059048 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1219 06:05:53.355273 2059048 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1219 06:05:53.355277 2059048 command_runner.go:130] >     "disableApparmor": false,
	I1219 06:05:53.355282 2059048 command_runner.go:130] >     "disableHugetlbController": true,
	I1219 06:05:53.355286 2059048 command_runner.go:130] >     "disableProcMount": false,
	I1219 06:05:53.355290 2059048 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1219 06:05:53.355294 2059048 command_runner.go:130] >     "enableCDI": true,
	I1219 06:05:53.355298 2059048 command_runner.go:130] >     "enableSelinux": false,
	I1219 06:05:53.355302 2059048 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1219 06:05:53.355306 2059048 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1219 06:05:53.355311 2059048 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1219 06:05:53.355319 2059048 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1219 06:05:53.355323 2059048 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1219 06:05:53.355328 2059048 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1219 06:05:53.355332 2059048 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1219 06:05:53.355338 2059048 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1219 06:05:53.355342 2059048 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1219 06:05:53.355347 2059048 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1219 06:05:53.355357 2059048 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1219 06:05:53.355362 2059048 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1219 06:05:53.355365 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355369 2059048 command_runner.go:130] >   "features": {
	I1219 06:05:53.355373 2059048 command_runner.go:130] >     "supplemental_groups_policy": true
	I1219 06:05:53.355376 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355379 2059048 command_runner.go:130] >   "golang": "go1.24.9",
	I1219 06:05:53.355389 2059048 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1219 06:05:53.355399 2059048 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1219 06:05:53.355402 2059048 command_runner.go:130] >   "runtimeHandlers": [
	I1219 06:05:53.355406 2059048 command_runner.go:130] >     {
	I1219 06:05:53.355409 2059048 command_runner.go:130] >       "features": {
	I1219 06:05:53.355414 2059048 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1219 06:05:53.355418 2059048 command_runner.go:130] >         "user_namespaces": true
	I1219 06:05:53.355421 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355424 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355427 2059048 command_runner.go:130] >     {
	I1219 06:05:53.355431 2059048 command_runner.go:130] >       "features": {
	I1219 06:05:53.355436 2059048 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1219 06:05:53.355440 2059048 command_runner.go:130] >         "user_namespaces": true
	I1219 06:05:53.355443 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355447 2059048 command_runner.go:130] >       "name": "runc"
	I1219 06:05:53.355449 2059048 command_runner.go:130] >     }
	I1219 06:05:53.355452 2059048 command_runner.go:130] >   ],
	I1219 06:05:53.355456 2059048 command_runner.go:130] >   "status": {
	I1219 06:05:53.355460 2059048 command_runner.go:130] >     "conditions": [
	I1219 06:05:53.355463 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355467 2059048 command_runner.go:130] >         "message": "",
	I1219 06:05:53.355471 2059048 command_runner.go:130] >         "reason": "",
	I1219 06:05:53.355475 2059048 command_runner.go:130] >         "status": true,
	I1219 06:05:53.355480 2059048 command_runner.go:130] >         "type": "RuntimeReady"
	I1219 06:05:53.355483 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355486 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355495 2059048 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1219 06:05:53.355500 2059048 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1219 06:05:53.355504 2059048 command_runner.go:130] >         "status": false,
	I1219 06:05:53.355508 2059048 command_runner.go:130] >         "type": "NetworkReady"
	I1219 06:05:53.355512 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355515 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355536 2059048 command_runner.go:130] >         "message": "{\"io.containerd.deprecation/cgroup-v1\":\"The support for cgroup v1 is deprecated since containerd v2.2 and will be removed by no later than May 2029. Upgrade the host to use cgroup v2.\"}",
	I1219 06:05:53.355541 2059048 command_runner.go:130] >         "reason": "ContainerdHasDeprecationWarnings",
	I1219 06:05:53.355547 2059048 command_runner.go:130] >         "status": false,
	I1219 06:05:53.355552 2059048 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1219 06:05:53.355555 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355557 2059048 command_runner.go:130] >     ]
	I1219 06:05:53.355560 2059048 command_runner.go:130] >   }
	I1219 06:05:53.355563 2059048 command_runner.go:130] > }
	I1219 06:05:53.357747 2059048 cni.go:84] Creating CNI manager for ""
	I1219 06:05:53.357770 2059048 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:05:53.357795 2059048 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1219 06:05:53.357824 2059048 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-006924 NodeName:functional-006924 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1219 06:05:53.357938 2059048 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-006924"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1219 06:05:53.358021 2059048 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1219 06:05:53.365051 2059048 command_runner.go:130] > kubeadm
	I1219 06:05:53.365070 2059048 command_runner.go:130] > kubectl
	I1219 06:05:53.365074 2059048 command_runner.go:130] > kubelet
	I1219 06:05:53.366033 2059048 binaries.go:51] Found k8s binaries, skipping transfer
	I1219 06:05:53.366118 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1219 06:05:53.373810 2059048 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1219 06:05:53.386231 2059048 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1219 06:05:53.399156 2059048 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2235 bytes)
	I1219 06:05:53.411832 2059048 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1219 06:05:53.415476 2059048 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1219 06:05:53.415580 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:53.524736 2059048 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:05:53.900522 2059048 certs.go:69] Setting up /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924 for IP: 192.168.49.2
	I1219 06:05:53.900547 2059048 certs.go:195] generating shared ca certs ...
	I1219 06:05:53.900563 2059048 certs.go:227] acquiring lock for ca certs: {Name:mk382c71693ea4061363f97b153b21bf6cdf5f38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:53.900702 2059048 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key
	I1219 06:05:53.900780 2059048 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key
	I1219 06:05:53.900803 2059048 certs.go:257] generating profile certs ...
	I1219 06:05:53.900908 2059048 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key
	I1219 06:05:53.900976 2059048 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key.febe6fed
	I1219 06:05:53.901024 2059048 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key
	I1219 06:05:53.901037 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1219 06:05:53.901081 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1219 06:05:53.901098 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1219 06:05:53.901109 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1219 06:05:53.901127 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1219 06:05:53.901139 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1219 06:05:53.901154 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1219 06:05:53.901171 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1219 06:05:53.901229 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem (1338 bytes)
	W1219 06:05:53.901264 2059048 certs.go:480] ignoring /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386_empty.pem, impossibly tiny 0 bytes
	I1219 06:05:53.901277 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem (1679 bytes)
	I1219 06:05:53.901306 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem (1078 bytes)
	I1219 06:05:53.901333 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem (1123 bytes)
	I1219 06:05:53.901365 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem (1671 bytes)
	I1219 06:05:53.901418 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:05:53.901449 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:53.901465 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem -> /usr/share/ca-certificates/2000386.pem
	I1219 06:05:53.901481 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> /usr/share/ca-certificates/20003862.pem
	I1219 06:05:53.902039 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1219 06:05:53.926748 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1219 06:05:53.945718 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1219 06:05:53.964111 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1219 06:05:53.984388 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1219 06:05:54.005796 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1219 06:05:54.027058 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1219 06:05:54.045330 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1219 06:05:54.062681 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1219 06:05:54.080390 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem --> /usr/share/ca-certificates/2000386.pem (1338 bytes)
	I1219 06:05:54.102399 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /usr/share/ca-certificates/20003862.pem (1708 bytes)
	I1219 06:05:54.120580 2059048 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (722 bytes)
	I1219 06:05:54.133732 2059048 ssh_runner.go:195] Run: openssl version
	I1219 06:05:54.139799 2059048 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1219 06:05:54.140191 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.147812 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/20003862.pem /etc/ssl/certs/20003862.pem
	I1219 06:05:54.155315 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159037 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159108 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159165 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.200029 2059048 command_runner.go:130] > 3ec20f2e
	I1219 06:05:54.200546 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1219 06:05:54.208733 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.216254 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1219 06:05:54.224240 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228059 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228165 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228244 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.268794 2059048 command_runner.go:130] > b5213941
	I1219 06:05:54.269372 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1219 06:05:54.277054 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.284467 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/2000386.pem /etc/ssl/certs/2000386.pem
	I1219 06:05:54.291949 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295750 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295798 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295849 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.341163 2059048 command_runner.go:130] > 51391683
	I1219 06:05:54.341782 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
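The three cycles above each follow OpenSSL's trust-store convention: hash the CA with `openssl x509 -hash`, then symlink it as `/etc/ssl/certs/<hash>.0` so lookup-by-subject works at verify time. A minimal sketch of that step, assuming `openssl` is available; the throwaway cert, the `demoCA` subject, and the temp directory stand in for the log's real CAs and `/etc/ssl/certs`:

```shell
#!/bin/sh
# Reproduce the hash-and-symlink cycle from the log in a scratch directory.
set -eu
dir=$(mktemp -d)

# Throwaway self-signed CA (stand-in for minikubeCA.pem in the log).
openssl req -x509 -newkey rsa:2048 -nodes -days 2 -subj "/CN=demoCA" \
  -keyout "$dir/ca.key" -out "$dir/ca.pem" 2>/dev/null

# Subject-name hash, the same value the log captures (e.g. b5213941).
hash=$(openssl x509 -hash -noout -in "$dir/ca.pem")

# The <hash>.0 symlink is what `openssl verify -CApath` resolves.
ln -fs "$dir/ca.pem" "$dir/$hash.0"
```

With the link in place, `openssl verify -CApath "$dir" some-cert.pem` can find the CA by subject hash; minikube does the equivalent `ln -fs` into `/etc/ssl/certs` for each cert it installs.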
	I1219 06:05:54.349497 2059048 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:05:54.353229 2059048 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:05:54.353253 2059048 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1219 06:05:54.353261 2059048 command_runner.go:130] > Device: 259,1	Inode: 1582667     Links: 1
	I1219 06:05:54.353268 2059048 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1219 06:05:54.353275 2059048 command_runner.go:130] > Access: 2025-12-19 06:01:47.245300782 +0000
	I1219 06:05:54.353281 2059048 command_runner.go:130] > Modify: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353286 2059048 command_runner.go:130] > Change: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353294 2059048 command_runner.go:130] >  Birth: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353372 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1219 06:05:54.398897 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.399374 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1219 06:05:54.440111 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.440565 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1219 06:05:54.481409 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.481968 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1219 06:05:54.522576 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.523020 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1219 06:05:54.563365 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.563892 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1219 06:05:54.604428 2059048 command_runner.go:130] > Certificate will not expire
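Each "Certificate will not expire" line above is the stdout of `openssl x509 -checkend 86400`, which exits 0 if the certificate is still valid 24 hours from now and 1 if it will have expired. A self-contained sketch of that probe, assuming `openssl` is available; the 2-day demo cert is a stand-in for the profile certs checked in the log:

```shell
#!/bin/sh
# Demonstrate the -checkend probe minikube runs against each profile cert.
set -eu
dir=$(mktemp -d)

# Throwaway cert valid for 2 days, so the 24h window check passes.
openssl req -x509 -newkey rsa:2048 -nodes -days 2 -subj "/CN=demo" \
  -keyout "$dir/demo.key" -out "$dir/demo.crt" 2>/dev/null

# Exit status 0 => prints "Certificate will not expire".
if openssl x509 -noout -in "$dir/demo.crt" -checkend 86400; then
  status=fresh      # valid for at least another day
else
  status=expiring   # would trigger cert regeneration instead of reuse
fi
```

A cert that fails this check is regenerated before `StartCluster` rather than reused, which is why the log only proceeds after six "will not expire" results.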
	I1219 06:05:54.604920 2059048 kubeadm.go:401] StartCluster: {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:54.605002 2059048 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1219 06:05:54.605063 2059048 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:05:54.631433 2059048 cri.go:92] found id: ""
	I1219 06:05:54.631512 2059048 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1219 06:05:54.638289 2059048 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1219 06:05:54.638353 2059048 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1219 06:05:54.638374 2059048 command_runner.go:130] > /var/lib/minikube/etcd:
	I1219 06:05:54.639191 2059048 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1219 06:05:54.639207 2059048 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1219 06:05:54.639278 2059048 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1219 06:05:54.646289 2059048 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:05:54.646704 2059048 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-006924" does not appear in /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.646809 2059048 kubeconfig.go:62] /home/jenkins/minikube-integration/22230-1998525/kubeconfig needs updating (will repair): [kubeconfig missing "functional-006924" cluster setting kubeconfig missing "functional-006924" context setting]
	I1219 06:05:54.647118 2059048 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/kubeconfig: {Name:mk7db1732c7d76f01100426cb283dc7515a3b9ea Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.647542 2059048 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.647700 2059048 kapi.go:59] client config for functional-006924: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1219 06:05:54.648289 2059048 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1219 06:05:54.648312 2059048 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1219 06:05:54.648318 2059048 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1219 06:05:54.648377 2059048 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1219 06:05:54.648389 2059048 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1219 06:05:54.648357 2059048 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1219 06:05:54.648779 2059048 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1219 06:05:54.659696 2059048 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1219 06:05:54.659739 2059048 kubeadm.go:602] duration metric: took 20.517186ms to restartPrimaryControlPlane
	I1219 06:05:54.659750 2059048 kubeadm.go:403] duration metric: took 54.838405ms to StartCluster
	I1219 06:05:54.659766 2059048 settings.go:142] acquiring lock: {Name:mk0fb518a1861caea9ce90c087e9f98ff93c6842 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.659859 2059048 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.660602 2059048 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/kubeconfig: {Name:mk7db1732c7d76f01100426cb283dc7515a3b9ea Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.660878 2059048 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1219 06:05:54.661080 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:54.661197 2059048 addons.go:543] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1219 06:05:54.661465 2059048 addons.go:70] Setting storage-provisioner=true in profile "functional-006924"
	I1219 06:05:54.661481 2059048 addons.go:239] Setting addon storage-provisioner=true in "functional-006924"
	I1219 06:05:54.661506 2059048 host.go:66] Checking if "functional-006924" exists ...
	I1219 06:05:54.661954 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.662128 2059048 addons.go:70] Setting default-storageclass=true in profile "functional-006924"
	I1219 06:05:54.662158 2059048 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-006924"
	I1219 06:05:54.662427 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.667300 2059048 out.go:179] * Verifying Kubernetes components...
	I1219 06:05:54.673650 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:54.689683 2059048 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.689848 2059048 kapi.go:59] client config for functional-006924: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1219 06:05:54.690123 2059048 addons.go:239] Setting addon default-storageclass=true in "functional-006924"
	I1219 06:05:54.690152 2059048 host.go:66] Checking if "functional-006924" exists ...
	I1219 06:05:54.690560 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.715008 2059048 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1219 06:05:54.717850 2059048 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:54.717879 2059048 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1219 06:05:54.717946 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:54.734767 2059048 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:54.734788 2059048 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1219 06:05:54.734856 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:54.764236 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:54.773070 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:54.876977 2059048 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:05:54.898675 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:54.923995 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:55.652544 2059048 node_ready.go:35] waiting up to 6m0s for node "functional-006924" to be "Ready" ...
	I1219 06:05:55.652680 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:55.652777 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:55.653088 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.653133 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653174 2059048 retry.go:31] will retry after 152.748ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653242 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.653274 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653290 2059048 retry.go:31] will retry after 222.401366ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:55.806850 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:55.871164 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.871241 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.871268 2059048 retry.go:31] will retry after 248.166368ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.876351 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:55.932419 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.936105 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.936137 2059048 retry.go:31] will retry after 191.546131ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.120512 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:56.128049 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:56.153420 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:56.153544 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:56.153844 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:56.188805 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.192400 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.192475 2059048 retry.go:31] will retry after 421.141509ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.203130 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.203228 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.203252 2059048 retry.go:31] will retry after 495.708783ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.614800 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:56.653236 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:56.653361 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:56.653708 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:56.677894 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.677943 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.677993 2059048 retry.go:31] will retry after 980.857907ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.700099 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:56.755124 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.758623 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.758652 2059048 retry.go:31] will retry after 1.143622688s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.152911 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:57.153042 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:57.153399 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:57.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:57.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:57.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:05:57.653378 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:05:57.659518 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:57.724667 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:57.724716 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.724735 2059048 retry.go:31] will retry after 900.329628ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.903067 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:57.986230 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:57.986314 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.986340 2059048 retry.go:31] will retry after 1.7845791s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:58.153671 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:58.153749 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:58.154120 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:58.625732 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:58.653113 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:58.653187 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:58.653475 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:58.712944 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:58.713042 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:58.713071 2059048 retry.go:31] will retry after 2.322946675s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:59.153740 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:59.153822 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:59.154186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:59.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:59.652927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:59.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:59.771577 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:59.835749 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:59.839442 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:59.839476 2059048 retry.go:31] will retry after 2.412907222s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:00.152821 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:00.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:00.153306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:00.153393 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:00.653320 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:00.653404 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:00.653734 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:01.036322 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:01.102362 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:01.106179 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:01.106214 2059048 retry.go:31] will retry after 2.139899672s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:01.153490 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:01.153572 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:01.153855 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:01.653656 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:01.653732 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:01.654026 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:02.152793 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:02.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:02.153204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:02.252582 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:02.312437 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:02.312479 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:02.312500 2059048 retry.go:31] will retry after 1.566668648s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:02.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:02.652958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:02.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:02.653283 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:03.152957 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:03.153054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:03.153393 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:03.246844 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:03.302237 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:03.305728 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.305771 2059048 retry.go:31] will retry after 6.170177016s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.653408 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:03.653482 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:03.653834 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:03.880237 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:03.939688 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:03.939736 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.939756 2059048 retry.go:31] will retry after 4.919693289s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:04.153025 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:04.153101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:04.153368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:04.653333 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:04.653405 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:04.653716 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:04.653762 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:05.153589 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:05.153680 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:05.154012 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:05.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:05.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:05.653271 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:06.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:06.152922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:06.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:06.652913 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:06.652987 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:06.653350 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:07.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:07.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:07.153248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:07.153305 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:07.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:07.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:07.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:08.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:08.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.652856 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:08.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:08.653179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.859603 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:08.923746 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:08.923802 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:08.923824 2059048 retry.go:31] will retry after 7.49455239s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.153273 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:09.153361 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:09.153733 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:09.153794 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:09.476166 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:09.536340 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:09.536378 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.536397 2059048 retry.go:31] will retry after 3.264542795s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.652787 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:09.652863 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:09.653191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:10.152879 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:10.152955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:10.153217 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:10.653092 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:10.653172 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:10.653505 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:11.153189 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:11.153267 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:11.153564 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:11.653360 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:11.653432 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:11.653748 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:11.653809 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:12.153584 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:12.153667 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:12.154066 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:12.652816 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:12.652897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:12.653232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:12.801732 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:12.858668 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:12.858722 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:12.858742 2059048 retry.go:31] will retry after 7.015856992s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:13.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:13.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:13.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:13.652838 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:13.652915 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:13.653206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:14.152876 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:14.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:14.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:14.153340 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:14.653224 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:14.653299 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:14.653566 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:15.153381 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:15.153458 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:15.153856 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:15.653715 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:15.653796 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:15.654137 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:16.153469 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:16.153543 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:16.153826 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:16.153868 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:16.419404 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:16.476671 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:16.480081 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:16.480119 2059048 retry.go:31] will retry after 7.9937579s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:16.653575 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:16.653716 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:16.653985 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:17.153751 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:17.153850 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:17.154134 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:17.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:17.652939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:17.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:18.152893 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:18.152976 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:18.153301 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:18.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:18.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:18.653233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:18.653289 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:19.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:19.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:19.153227 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:19.652934 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:19.653010 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:19.653354 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:19.875781 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:19.950537 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:19.954067 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:19.954097 2059048 retry.go:31] will retry after 12.496952157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:20.153579 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:20.153656 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:20.154027 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:20.652751 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:20.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:20.653178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:21.153037 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:21.153112 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:21.153446 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:21.153504 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:21.652852 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:21.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:21.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:22.152882 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:22.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:22.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:22.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:22.652925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:22.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:23.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:23.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:23.153240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:23.652818 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:23.652892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:23.653158 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:23.653200 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:24.152783 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:24.152857 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:24.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:24.474774 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:24.538538 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:24.538585 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:24.538605 2059048 retry.go:31] will retry after 14.635173495s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:24.653139 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:24.653215 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:24.653538 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:25.153284 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:25.153354 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:25.153661 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:25.653607 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:25.653689 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:25.653986 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:25.654040 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:26.152728 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:26.152857 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:26.153186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:26.652777 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:26.652852 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:26.653175 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:27.152839 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:27.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:27.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:27.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:27.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:27.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:28.152874 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:28.152956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:28.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:28.153286 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:28.652849 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:28.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:28.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:29.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:29.152930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:29.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:29.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:29.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:29.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:30.152853 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:30.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:30.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:30.153348 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:30.653022 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:30.653115 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:30.653416 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:31.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:31.152960 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:31.153275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:31.652985 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:31.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:31.653405 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:32.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:32.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:32.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:32.451758 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:32.506473 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:32.509966 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:32.509998 2059048 retry.go:31] will retry after 31.028140902s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:32.653234 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:32.653309 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:32.653632 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:32.653749 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:33.153497 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:33.153583 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:33.153949 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:33.652732 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:33.652832 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:33.653182 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:34.152891 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:34.152964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:34.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:34.653098 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:34.653173 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:34.653525 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:35.153363 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:35.153489 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:35.153845 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:35.153907 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:35.653568 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:35.653649 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:35.653928 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:36.153725 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:36.153800 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:36.154115 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:36.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:36.652933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:36.653274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:37.153419 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:37.153492 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:37.153866 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:37.153952 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:37.653726 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:37.653797 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:37.654143 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:38.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:38.152933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:38.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:38.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:38.652935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:38.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:39.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:39.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:39.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:39.174643 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:39.239291 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:39.239335 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:39.239354 2059048 retry.go:31] will retry after 15.420333699s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:39.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:39.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:39.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:39.653316 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:40.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:40.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:40.153285 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:40.653056 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:40.653131 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:40.653494 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:41.153188 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:41.153263 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:41.153588 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:41.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:41.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:41.653248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:42.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:42.152964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:42.153379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:42.153461 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:42.652919 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:42.653000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:42.653314 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:43.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:43.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:43.153240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:43.652956 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:43.653027 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:43.653379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:44.152963 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:44.153044 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:44.153381 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:44.653201 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:44.653284 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:44.653550 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:44.653592 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:45.153794 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:45.153882 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:45.154325 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:45.653107 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:45.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:45.653497 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:46.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:46.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:46.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:46.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:46.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:46.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:47.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:47.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:47.153246 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:47.153293 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:47.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:47.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:47.653331 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:48.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:48.152904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:48.153254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:48.652976 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:48.653054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:48.653401 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:49.152928 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:49.153003 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:49.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:49.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:49.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:49.653266 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:49.653325 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:50.152835 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:50.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:50.153230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:50.653021 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:50.653097 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:50.653360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:51.152819 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:51.152892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:51.153216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:51.652938 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:51.653021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:51.653350 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:51.653404 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:52.152878 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:52.152997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:52.153340 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:52.653054 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:52.653126 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:52.653428 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:53.152809 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:53.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:53.153202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:53.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:53.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:53.653212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:54.152921 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:54.153000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:54.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:54.153361 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:54.653430 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:54.653504 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:54.653886 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:54.660097 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:54.724740 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:54.724806 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:54.724824 2059048 retry.go:31] will retry after 21.489743806s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:55.153047 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:55.153170 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:55.153542 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:55.653137 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:55.653210 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:55.653500 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:56.153216 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:56.153285 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:56.153620 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:56.153682 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:56.653420 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:56.653501 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:56.653832 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:57.153605 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:57.153702 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:57.154020 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:57.652746 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:57.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:57.653184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:58.152882 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:58.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:58.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:58.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:58.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:58.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:58.653262 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:59.152798 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:59.152874 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:59.153218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:59.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:59.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:59.653193 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:00.155125 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:00.155210 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:00.156183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:00.653343 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:00.653416 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:00.653737 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:00.653787 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:01.152970 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:01.153062 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:01.153389 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:01.652835 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:01.652908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:01.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:02.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:02.152952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:02.153330 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:02.652900 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:02.652986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:02.653368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:03.152826 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:03.152908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:03.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:03.153306 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:03.538820 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:07:03.598261 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:03.602187 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:03.602221 2059048 retry.go:31] will retry after 27.693032791s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:03.653410 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:03.653486 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:03.653840 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:04.153298 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:04.153371 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:04.153670 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:04.653539 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:04.653618 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:04.653956 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:05.153749 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:05.153837 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:05.154160 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:05.154219 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:05.653149 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:05.653217 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:05.653546 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:06.153378 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:06.153468 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:06.153799 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:06.653420 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:06.653494 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:06.653803 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:07.153113 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:07.153187 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:07.153451 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:07.652897 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:07.652979 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:07.653292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:07.653351 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:08.152825 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:08.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:08.153274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:08.652859 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:08.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:08.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:09.153667 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:09.153756 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:09.154076 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:09.652824 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:09.652899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:09.653232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:10.153341 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:10.153410 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:10.153757 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:10.153818 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:10.653710 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:10.653802 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:10.654164 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:11.152777 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:11.152862 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:11.153199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:11.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:11.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:11.653219 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:12.152831 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:12.152908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:12.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:12.652832 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:12.652911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:12.653226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:12.653273 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:13.152877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:13.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:13.153279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:13.652822 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:13.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:13.653241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:14.152814 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:14.152897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:14.153218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:14.653177 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:14.653250 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:14.653558 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:14.653611 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:15.153356 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:15.153436 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:15.153788 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:15.652725 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:15.652816 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:15.653161 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:16.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:16.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:16.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:16.215537 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:07:16.273841 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:16.273881 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:16.273899 2059048 retry.go:31] will retry after 30.872906877s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:16.653514 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:16.653598 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:16.653919 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:16.653970 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:17.153579 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:17.153656 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:17.153994 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:17.653597 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:17.653665 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:17.653945 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:18.152782 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:18.152859 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:18.153155 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:18.652836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:18.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:18.653269 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:19.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:19.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:19.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:19.153292 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:19.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:19.652916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:19.653250 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:20.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:20.152910 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:20.153258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:20.653266 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:20.653354 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:20.653711 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:21.153511 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:21.153585 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:21.153886 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:21.153948 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:21.653690 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:21.653776 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:21.654081 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:22.153312 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:22.153387 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:22.153749 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:22.653581 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:22.653661 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:22.654117 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:23.152715 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:23.152802 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:23.153141 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:23.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:23.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:23.653196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:23.653241 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:24.152848 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:24.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:24.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:24.653127 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:24.653203 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:24.653560 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:25.153321 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:25.153393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:25.153662 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:25.652744 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:25.652855 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:25.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:25.653298 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:26.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:26.153049 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:26.153397 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:26.652880 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:26.652963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:26.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:27.152811 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:27.152888 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:27.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:27.652936 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:27.653013 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:27.653346 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:27.653407 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:28.152886 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:28.152961 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:28.153298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:28.652829 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:28.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:28.653240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:29.152818 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:29.152890 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:29.153229 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:29.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:29.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:29.653200 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:30.152832 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:30.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:30.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:30.153321 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:30.652996 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:30.653069 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:30.653387 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:31.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:31.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:31.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:31.295743 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:07:31.365905 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:31.365953 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:31.366070 2059048 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1219 06:07:31.653338 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:31.653413 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:31.653757 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:32.153415 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:32.153519 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:32.153862 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:32.153934 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:32.653181 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:32.653249 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:32.653512 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:33.152815 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:33.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:33.153193 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:33.652817 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:33.652892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:33.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:34.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:34.152941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:34.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:34.653155 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:34.653231 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:34.653574 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:34.653631 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:35.153386 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:35.153461 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:35.153800 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:35.652767 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:35.652837 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:35.653104 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:36.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:36.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:36.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:36.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:36.652927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:36.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:37.152893 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:37.152978 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:37.153238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:37.153278 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:37.652927 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:37.653002 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:37.653295 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:38.152985 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:38.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:38.153404 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:38.652889 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:38.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:38.653220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:39.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:39.152920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:39.153233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:39.153294 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:39.652812 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:39.652889 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:39.653215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:40.152857 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:40.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:40.153187 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:40.653073 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:40.653148 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:40.653479 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:41.152804 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:41.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:41.153180 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:41.652771 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:41.652841 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:41.653154 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:41.653206 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:42.152884 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:42.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:42.153327 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:42.652853 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:42.652951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:42.653271 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:43.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:43.152931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:43.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:43.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:43.652918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:43.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:43.653314 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:44.152994 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:44.153073 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:44.153402 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:44.653413 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:44.653502 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:44.653799 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:45.153668 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:45.153801 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:45.154199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:45.653080 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:45.653158 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:45.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:45.653538 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:46.153253 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:46.153372 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:46.153620 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:46.653410 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:46.653505 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:46.653901 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:47.147624 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:07:47.153238 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:47.153313 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:47.153618 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:47.207245 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:47.207289 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:47.207381 2059048 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1219 06:07:47.212201 2059048 out.go:179] * Enabled addons: 
	I1219 06:07:47.215092 2059048 addons.go:546] duration metric: took 1m52.553895373s for enable addons: enabled=[]
	I1219 06:07:47.652840 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:47.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:47.653177 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:48.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:48.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:48.153274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:48.153336 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:48.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:48.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:48.653266 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:49.152877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:49.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:49.153222 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:49.652844 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:49.652940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:49.653312 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:50.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:50.152906 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:50.153448 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:50.153518 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:50.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:50.653101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:50.653362 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:51.153113 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:51.153194 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:51.153608 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:51.653414 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:51.653487 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:51.653829 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:52.153249 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:52.153337 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:52.153602 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:52.153645 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:52.653349 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:52.653422 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:52.653735 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:53.153542 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:53.153620 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:53.153960 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:53.653712 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:53.653793 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:53.654088 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:54.153153 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:54.153246 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:54.153650 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:54.153714 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:54.653597 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:54.653675 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:54.654059 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:55.153390 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:55.153470 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:55.153780 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:55.652892 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:55.652968 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:55.653343 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:56.153064 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:56.153144 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:56.153504 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:56.652934 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:56.653001 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:56.653305 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:56.653374 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:57.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:57.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:57.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:57.652979 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:57.653054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:57.653394 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:58.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:58.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:58.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:58.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:58.652916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:58.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:59.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:59.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:59.153252 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:59.153305 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:59.652871 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:59.652957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:59.653221 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:00.152926 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:00.153011 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:00.153341 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:00.653583 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:00.653664 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:00.654050 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:01.153430 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:01.153505 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:01.153842 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:01.153899 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:01.653658 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:01.653734 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:01.654077 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:02.152810 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:02.152894 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:02.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:02.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:02.652992 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:02.653255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:03.152814 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:03.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:03.153241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:03.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:03.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:03.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:03.653372 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:04.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:04.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:04.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:04.653311 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:04.653393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:04.653691 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:05.153373 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:05.153449 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:05.153786 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:05.653505 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:05.653577 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:05.653866 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:05.653911 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:06.153684 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:06.153763 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:06.154116 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:06.652849 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:06.652928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:06.653265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:07.152801 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:07.152875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:07.153140 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:07.652821 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:07.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:07.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:08.152989 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:08.153069 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:08.153365 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:08.153412 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:08.652920 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:08.652987 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:08.653255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:09.152797 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:09.152875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:09.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:09.652928 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:09.653026 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:09.653367 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:10.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:10.152928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:10.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:10.653055 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:10.653153 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:10.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:10.653536 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:11.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:11.152945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:11.153256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:11.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:11.652952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:11.653279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:12.152810 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:12.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:12.153241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:12.652960 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:12.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:12.653342 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:13.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:13.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:13.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:13.153250 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:13.652804 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:13.652895 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:13.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:14.152963 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:14.153048 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:14.153407 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:14.653382 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:14.653457 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:14.653810 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:15.153570 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:15.153650 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:15.153993 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:15.154055 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:15.652797 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:15.652875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:15.653205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:16.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:16.152927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:16.153181 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:16.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:16.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:16.653237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:17.152851 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:17.152930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:17.153255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:17.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:17.652952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:17.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:17.653327 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:18.152822 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:18.152897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:18.153211 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:18.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:18.652929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:18.653226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:19.152862 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:19.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:19.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:19.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:19.652943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:19.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:20.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:20.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:20.153353 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:20.153402 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:20.652914 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:20.652990 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:20.653265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:21.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:21.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:21.153199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:21.652903 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:21.652980 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:21.653286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:22.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:22.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:22.153212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:22.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:22.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:22.653257 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:22.653311 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:23.152987 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:23.153082 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:23.153450 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:23.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:23.652933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:23.653264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:24.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:24.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:24.153259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:24.653188 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:24.653259 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:24.653621 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:24.653676 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:25.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:25.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:25.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:25.653096 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:25.653176 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:25.653514 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:26.153303 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:26.153380 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:26.153718 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:26.653504 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:26.653583 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:26.653866 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:26.653917 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:27.153641 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:27.153723 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:27.154070 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:27.653768 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:27.653851 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:27.654198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:28.152880 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:28.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:28.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:28.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:28.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:28.653286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:29.152996 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:29.153076 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:29.153423 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:29.153485 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:29.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:29.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:29.653208 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:30.152848 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:30.152931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:30.153247 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:30.653099 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:30.653178 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:30.653543 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:31.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:31.152933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:31.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:31.652836 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:31.652916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:31.653254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:31.653310 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:32.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:32.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:32.153234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:32.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:32.652974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:32.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:33.152813 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:33.152892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:33.153182 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:33.652873 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:33.652952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:33.653279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:33.653339 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:34.152888 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:34.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:34.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:34.653221 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:34.653303 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:34.653662 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:35.153491 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:35.153567 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:35.153923 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:35.653686 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:35.653756 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:35.654034 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:35.654075 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:36.152742 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:36.152852 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:36.153178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:36.652917 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:36.652991 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:36.653328 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:37.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:37.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:37.153269 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:37.652832 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:37.652905 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:37.653225 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:38.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:38.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:38.153256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:38.153311 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:38.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:38.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:38.653254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:39.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:39.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:39.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:39.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:39.652948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:39.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:40.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:40.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:40.153201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:40.653085 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:40.653160 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:40.653488 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:40.653543 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:41.152787 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:41.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:41.153181 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:41.652770 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:41.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:41.653122 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:42.152887 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:42.152974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:42.153376 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:42.653103 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:42.653188 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:42.653511 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:42.653570 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:43.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:43.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:43.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:43.652856 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:43.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:43.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:44.153027 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:44.153105 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:44.153433 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:44.653459 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:44.653530 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:44.653788 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:44.653840 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:45.153678 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:45.153766 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:45.156105 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=2
	I1219 06:08:45.653114 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:45.653196 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:45.653533 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:46.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:46.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:46.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:46.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:46.652950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:46.653258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:47.152995 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:47.153106 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:47.153459 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:47.153515 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:47.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:47.652955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:47.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:48.152852 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:48.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:48.153282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:48.652874 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:48.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:48.653318 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:49.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:49.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:49.153202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:49.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:49.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:49.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:49.653282 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:50.152954 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:50.153027 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:50.153317 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:50.653024 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:50.653102 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:50.653365 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:51.152850 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:51.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:51.153219 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:51.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:51.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:51.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:51.653343 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:52.152878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:52.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:52.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:52.652974 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:52.653059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:52.653395 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:53.153096 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:53.153174 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:53.153508 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:53.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:53.652951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:53.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:54.152839 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:54.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:54.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:54.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:54.653199 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:54.653273 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:54.653604 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:55.153420 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:55.153510 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:55.153789 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:55.652811 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:55.652903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:55.653294 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:56.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:56.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:56.153223 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:56.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:56.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:56.653259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:56.653316 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:57.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:57.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:57.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:57.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:57.652981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:57.653323 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:58.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:58.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:58.153209 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:58.652840 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:58.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:58.653217 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:59.152905 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:59.152981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:59.153315 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:59.153381 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:59.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:59.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:59.653183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:00.152906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:00.153005 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:00.153357 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:00.653205 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:00.653291 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:00.653625 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:01.153283 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:01.153360 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:01.153628 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:01.153671 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:01.653416 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:01.653497 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:01.653884 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:02.153557 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:02.153633 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:02.154010 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:02.652736 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:02.652830 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:02.653106 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:03.152802 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:03.152877 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:03.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:03.652921 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:03.652999 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:03.653309 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:03.653358 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:04.152885 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:04.152955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:04.153320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:04.653344 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:04.653416 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:04.653746 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:05.153560 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:05.153640 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:05.153974 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:05.652768 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:05.652867 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:05.653179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:06.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:06.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:06.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:06.153272 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:06.652921 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:06.653000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:06.653306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:07.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:07.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:07.153227 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:07.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:07.652898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:07.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:08.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:08.152914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:08.153262 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:08.153318 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:08.652911 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:08.652979 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:08.653282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:09.152916 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:09.152986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:09.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:09.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:09.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:09.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:10.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:10.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:10.153226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:10.653112 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:10.653192 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:10.653511 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:10.653577 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:11.153350 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:11.153429 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:11.153777 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:11.653085 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:11.653162 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:11.653429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:12.152806 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:12.152904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:12.153213 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:12.652905 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:12.652980 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:12.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:13.152912 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:13.152981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:13.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:13.153367 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:13.653051 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:13.653127 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:13.653415 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:14.152821 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:14.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:14.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:14.653028 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:14.653106 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:14.653360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:15.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:15.152912 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:15.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:15.653006 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:15.653081 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:15.653415 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:15.653467 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:16.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:16.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:16.153234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:16.652830 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:16.652908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:16.653204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:17.152901 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:17.152974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:17.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:17.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:17.652923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:17.653180 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:18.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:18.152893 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:18.153221 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:18.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:18.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:18.652988 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:18.653283 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:19.152887 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:19.152961 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:19.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:19.652913 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:19.652992 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:19.653321 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:20.153018 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:20.153092 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:20.153437 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:20.153501 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:20.653011 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:20.653093 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:20.653372 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:21.153069 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:21.153153 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:21.153484 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:21.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:21.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:21.653322 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:22.153458 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:22.153530 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:22.153790 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:22.153833 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:22.653650 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:22.653724 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:22.654057 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:23.152779 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:23.152853 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:23.153175 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:23.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:23.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:23.653280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:24.152829 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:24.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:24.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:24.653129 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:24.653203 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:24.653539 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:24.653595 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:25.153181 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:25.153256 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:25.153525 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:25.653492 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:25.653572 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:25.653896 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:26.153709 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:26.153785 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:26.154149 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:26.653439 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:26.653511 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:26.653845 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:26.653912 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:27.153641 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:27.153711 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:27.154059 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:27.653737 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:27.653813 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:27.654171 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:28.152737 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:28.152824 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:28.153136 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:28.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:28.652929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:28.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:29.152816 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:29.152890 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:29.153304 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:29.153363 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:29.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:29.652949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:29.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:30.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:30.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:30.153286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:30.653142 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:30.653226 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:30.653576 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:31.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:31.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:31.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:31.652853 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:31.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:31.653285 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:31.653341 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:32.153002 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:32.153077 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:32.153389 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:32.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:32.652928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:32.653191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:33.152906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:33.152985 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:33.153320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:33.653030 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:33.653112 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:33.653463 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:33.653523 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:34.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:34.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:34.153191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:34.653266 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:34.653343 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:34.653688 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:35.153480 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:35.153562 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:35.153920 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:35.653700 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:35.653779 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:35.654078 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:35.654124 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:36.152813 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:36.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:36.153244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:36.652824 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:36.652902 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:36.653244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:37.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:37.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:37.153200 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:37.652845 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:37.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:37.653218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:38.152831 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:38.152912 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:38.153208 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:38.153253 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:38.652887 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:38.652966 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:38.653228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:39.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:39.152913 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:39.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:39.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:39.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:39.653299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:40.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:40.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:40.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:40.153287 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:40.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:40.653107 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:40.653447 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:41.152920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:41.153249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:41.652882 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:41.652956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:41.653222 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:42.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:42.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:42.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:42.153350 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:42.653039 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:42.653114 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:42.653443 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:43.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:43.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:43.153196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:43.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:43.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:43.653298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:44.153021 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:44.153098 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:44.153446 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:44.153502 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:44.653394 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:44.653463 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:44.653758 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:45.153716 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:45.153844 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:45.154316 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:45.653443 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:45.653522 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:45.653863 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:46.153623 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:46.153701 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:46.153971 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:46.154014 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:46.653765 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:46.653843 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:46.654187 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:47.152841 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:47.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:47.153244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:47.652861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:47.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:47.653190 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:48.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:48.152954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:48.153355 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:48.653077 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:48.653151 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:48.653475 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:48.653535 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:49.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:49.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:49.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:49.652829 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:49.652905 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:49.653255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:50.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:50.153052 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:50.153380 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:50.653140 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:50.653211 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:50.653679 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:50.653731 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:51.153473 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:51.153550 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:51.154738 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=1
	I1219 06:09:51.653546 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:51.653618 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:51.653958 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:52.153274 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:52.153349 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:52.153606 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:52.653351 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:52.653426 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:52.653752 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:52.653808 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:53.153430 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:53.153501 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:53.153810 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:53.653040 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:53.653137 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:53.653483 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:54.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:54.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:54.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:54.652950 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:54.653032 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:54.653335 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:55.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:55.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:55.153273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:55.153315 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:55.653562 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:55.653634 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:55.653988 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:56.152721 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:56.152824 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:56.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:56.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:56.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:56.653204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:57.152895 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:57.152971 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:57.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:57.153359 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:57.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:57.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:57.653235 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:58.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:58.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:58.153268 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:58.652975 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:58.653058 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:58.653396 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:59.153105 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:59.153184 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:59.153474 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:59.153520 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:59.652864 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:59.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:59.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:00.152931 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:00.153036 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:00.153356 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:00.653257 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:00.653334 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:00.653658 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:01.153453 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:01.153528 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:01.153794 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:01.153845 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:01.653619 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:01.653697 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:01.654088 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:02.153731 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:02.153810 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:02.154155 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:02.652784 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:02.652868 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:02.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:03.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:03.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:03.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:03.652980 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:03.653056 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:03.653404 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:03.653465 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:04.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:04.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:04.153292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:04.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:04.653267 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:04.653580 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:05.152837 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:05.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:05.153255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:05.652984 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:05.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:05.653348 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:06.153037 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:06.153117 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:06.153467 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:06.153522 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:06.653186 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:06.653261 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:06.653599 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:07.152860 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:07.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:07.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:07.652828 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:07.652907 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:07.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:08.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:08.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:08.153282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:08.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:08.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:08.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:08.653242 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:09.152822 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:09.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:09.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:09.652859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:09.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:09.653294 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:10.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:10.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:10.153225 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:10.653128 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:10.653226 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:10.653575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:10.653636 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:11.153427 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:11.153513 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:11.153854 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:11.653325 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:11.653393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:11.653695 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:12.153493 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:12.153567 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:12.153867 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:12.653672 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:12.653754 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:12.654079 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:12.654129 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:13.152786 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:13.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:13.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:13.652847 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:13.652923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:13.653234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:14.152794 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:14.152866 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:14.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:14.653147 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:14.653224 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:14.653478 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:15.152818 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:15.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:15.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:15.153301 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:15.653095 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:15.653176 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:15.653536 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:16.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:16.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:16.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:16.652894 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:16.652982 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:16.653351 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:17.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:17.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:17.153312 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:17.153367 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:17.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:17.652943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:17.653235 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:18.152825 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:18.152903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:18.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:18.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:18.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:18.653272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:19.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:19.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:19.153179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:19.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:19.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:19.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:19.653346 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:20.152854 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:20.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:20.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:20.653045 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:20.653125 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:20.653445 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:21.153150 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:21.153227 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:21.153575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:21.652847 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:21.652931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:21.653260 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:22.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:22.152940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:22.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:22.153260 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:22.652895 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:22.652974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:22.653308 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:23.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:23.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:23.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:23.653479 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:23.653551 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:23.653818 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:24.153710 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:24.153800 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:24.154142 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:24.154201 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:24.653230 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:24.653310 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:24.653643 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:25.153415 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:25.153494 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:25.153825 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:25.652780 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:25.652869 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:25.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:26.152955 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:26.153029 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:26.153332 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:26.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:26.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:26.653203 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:26.653244 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:27.152819 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:27.152893 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:27.153237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:27.652953 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:27.653040 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:27.653394 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:28.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:28.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:28.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:28.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:28.652921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:28.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:28.653296 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:29.153007 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:29.153109 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:29.153490 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:29.652904 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:29.653002 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:29.653393 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:30.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:30.152941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:30.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:30.653036 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:30.653110 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:30.653469 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:30.653528 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:31.152866 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:31.152940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:31.153201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:31.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:31.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:31.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:32.152981 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:32.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:32.153421 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:32.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:32.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:32.653184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:33.152815 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:33.152902 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:33.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:33.153256 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:33.652858 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:33.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:33.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:34.153699 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:34.153778 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:34.154156 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:34.652927 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:34.653004 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:34.653344 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:35.152944 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:35.153053 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:35.153407 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:35.153473 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:35.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:35.653108 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:35.653439 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:36.152985 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:36.153058 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:36.153410 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:36.652997 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:36.653074 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:36.653385 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:37.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:37.152927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:37.153178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:37.652820 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:37.652898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:37.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:37.653293 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:38.152835 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:38.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:38.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:38.652893 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:38.652984 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:38.653356 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:39.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:39.152909 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:39.153250 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:39.652830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:39.652914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:39.653251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:40.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:40.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:40.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:40.153267 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:40.653044 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:40.653127 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:40.653472 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:41.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:41.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:41.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:41.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:41.653202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:42.152888 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:42.152978 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:42.153360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:42.153425 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:42.653108 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:42.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:42.653535 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:43.153293 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:43.153377 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:43.153699 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:43.653511 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:43.653596 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:43.653946 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:44.153626 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:44.153701 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:44.154058 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:44.154116 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:44.653518 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:44.653586 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:44.653839 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:45.153714 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:45.153803 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:45.154242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:45.653103 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:45.653182 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:45.653567 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:46.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:46.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:46.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:46.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:46.652901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:46.653220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:46.653276 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:47.153000 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:47.153090 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:47.153484 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:47.652874 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:47.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:47.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:48.152829 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:48.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:48.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:48.653001 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:48.653085 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:48.653425 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:48.653480 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:49.152866 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:49.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:49.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:49.652909 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:49.652997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:49.653352 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:50.152932 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:50.153010 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:50.153347 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:50.653023 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:50.653100 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:50.653383 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:51.153061 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:51.153137 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:51.153452 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:51.153512 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:51.653210 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:51.653296 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:51.653657 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:52.153489 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:52.153557 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:52.153876 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:52.653692 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:52.653768 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:52.654090 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:53.152787 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:53.152864 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:53.153157 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:53.652784 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:53.652862 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:53.653125 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:53.653167 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:54.152851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:54.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:54.153283 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:54.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:54.653265 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:54.653642 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:55.153555 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:55.153638 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:55.153984 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:55.652996 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:55.653078 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:55.653399 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:55.653461 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:56.153144 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:56.153229 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:56.153575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:56.653124 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:56.653197 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:56.653482 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:57.152912 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:57.152988 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:57.153306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:57.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:57.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:57.653287 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:58.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:58.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:58.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:58.153273 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:58.652822 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:58.652903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:58.653236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:59.152792 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:59.152869 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:59.153185 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:59.652732 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:59.652827 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:59.653140 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:00.152931 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:00.153022 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:00.153339 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:00.153391 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:00.653248 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:00.653323 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:00.653669 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:01.153449 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:01.153521 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:01.153868 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:01.653237 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:01.653336 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:01.653684 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:02.153489 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:02.153575 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:02.153909 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:02.153978 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:02.653742 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:02.653822 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:02.654154 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:03.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:03.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:03.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:03.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:03.652939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:03.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:04.152862 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:04.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:04.153190 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:04.653274 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:04.653353 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:04.653687 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:04.653753 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:05.153511 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:05.153585 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:05.153947 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:05.652710 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:05.652796 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:05.653061 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:06.152795 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:06.152871 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:06.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:06.652955 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:06.653039 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:06.653379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:07.152874 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:07.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:07.153220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:07.153260 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:07.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:07.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:07.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:08.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:08.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:08.153280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:08.652915 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:08.652997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:08.653258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:09.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:09.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:09.153313 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:09.153369 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:09.653051 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:09.653130 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:09.653466 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:10.152899 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:10.152977 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:10.153296 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:10.653043 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:10.653165 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:10.653488 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:11.152868 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:11.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:11.153273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:11.653413 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:11.653483 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:11.653817 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:11.653864 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:12.153612 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:12.153686 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:12.154026 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:12.652742 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:12.652850 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:12.653128 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:13.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:13.152886 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:13.153131 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:13.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:13.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:13.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:14.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:14.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:14.153275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:14.153331 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:14.653226 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:14.653309 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:14.653648 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:15.153413 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:15.153488 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:15.153804 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:15.652744 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:15.652844 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:15.653142 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:16.152784 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:16.152853 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:16.153159 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:16.652816 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:16.652891 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:16.653186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:16.653233 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:17.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:17.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:17.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:17.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:17.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:17.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:18.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:18.152914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:18.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:18.652945 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:18.653025 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:18.653318 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:18.653380 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:19.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:19.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:19.153231 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:19.652844 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:19.652920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:19.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:20.152860 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:20.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:20.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:20.653014 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:20.653092 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:20.653351 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:21.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:21.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:21.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:21.153341 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:21.653010 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:21.653087 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:21.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:22.152850 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:22.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:22.153186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:22.652828 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:22.652906 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:22.653231 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:23.152837 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:23.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:23.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:23.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:23.652942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:23.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:23.653241 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:24.152900 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:24.152984 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:24.153313 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:24.653189 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:24.653270 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:24.653611 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:25.152919 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:25.152989 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:25.153298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:25.653028 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:25.653101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:25.653438 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:25.653492 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:26.153176 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:26.153256 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:26.153570 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:26.652912 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:26.652986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:26.653378 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:27.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:27.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:27.153259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:27.652839 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:27.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:27.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:28.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:28.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:28.153233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:28.153276 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:28.652833 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:28.652907 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:28.653249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:29.152989 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:29.153064 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:29.153396 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:29.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:29.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:29.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:30.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:30.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:30.153258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:30.153317 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:30.653045 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:30.653118 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:30.653429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:31.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:31.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:31.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:31.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:31.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:31.653292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:32.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:32.152919 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:32.153241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:32.652864 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:32.652940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:32.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:32.653263 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:33.152791 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:33.152875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:33.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:33.652896 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:33.652977 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:33.653320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:34.152876 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:34.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:34.153289 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:34.653352 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:34.653430 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:34.653807 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:34.653869 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:35.153637 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:35.153718 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:35.154044 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:35.652946 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:35.653021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:35.653340 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:36.152965 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:36.153049 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:36.153384 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:36.653133 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:36.653213 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:36.653535 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:37.153283 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:37.153355 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:37.153669 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:37.153731 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:37.653516 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:37.653597 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:37.653938 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:38.153755 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:38.153833 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:38.154248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:38.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:38.652956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:38.653272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:39.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:39.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:39.153296 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:39.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:39.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:39.653289 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:39.653347 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:40.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:40.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:40.153212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:40.653026 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:40.653107 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:40.653447 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:41.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:41.153249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:41.652949 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:41.653030 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:41.653297 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:42.152907 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:42.153021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:42.153435 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:42.153505 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:42.653182 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:42.653258 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:42.653594 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:43.152852 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:43.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:43.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:43.652858 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:43.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:43.653276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:44.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:44.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:44.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:44.653236 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:44.653312 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:44.653574 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:44.653614 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:45.153508 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:45.153630 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:45.154114 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:45.653037 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:45.653120 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:45.653478 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:46.152854 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:46.152928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:46.153280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:46.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:46.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:46.653286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:47.152995 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:47.153070 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:47.153369 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:47.153415 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:47.652884 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:47.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:47.653305 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:48.152817 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:48.152895 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:48.153237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:48.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:48.652925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:48.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:49.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:49.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:49.153213 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:49.652807 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:49.652884 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:49.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:49.653285 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:50.152972 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:50.153050 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:50.153344 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:50.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:50.653264 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:50.653522 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:51.153213 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:51.153287 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:51.153583 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:51.653360 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:51.653435 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:51.653779 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:51.653840 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:52.153070 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:52.153143 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:52.153403 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:52.653101 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:52.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:52.653483 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:53.152800 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:53.152878 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:53.153196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:53.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:53.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:53.653274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:54.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:54.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:54.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:54.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:54.653291 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:54.653363 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:54.653706 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:55.152998 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:55.153089 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:55.153429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:55.653064 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:55.653125 2059048 node_ready.go:38] duration metric: took 6m0.000540604s for node "functional-006924" to be "Ready" ...
	I1219 06:11:55.656290 2059048 out.go:203] 
	W1219 06:11:55.659114 2059048 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1219 06:11:55.659135 2059048 out.go:285] * 
	W1219 06:11:55.661307 2059048 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1219 06:11:55.664349 2059048 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 19 06:12:03 functional-006924 containerd[5249]: time="2025-12-19T06:12:03.448591666Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:04 functional-006924 containerd[5249]: time="2025-12-19T06:12:04.498334228Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 19 06:12:04 functional-006924 containerd[5249]: time="2025-12-19T06:12:04.500470113Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 19 06:12:04 functional-006924 containerd[5249]: time="2025-12-19T06:12:04.508749787Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:04 functional-006924 containerd[5249]: time="2025-12-19T06:12:04.509292299Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:05 functional-006924 containerd[5249]: time="2025-12-19T06:12:05.455765557Z" level=info msg="No images store for sha256:77bd2e9ec09b9f03e181ef448174ba62f2bf72888843372bb729abc0e9bb591d"
	Dec 19 06:12:05 functional-006924 containerd[5249]: time="2025-12-19T06:12:05.457957869Z" level=info msg="ImageCreate event name:\"docker.io/library/minikube-local-cache-test:functional-006924\""
	Dec 19 06:12:05 functional-006924 containerd[5249]: time="2025-12-19T06:12:05.465100463Z" level=info msg="ImageCreate event name:\"sha256:51d14939a1995a88415fccb269ec40dc043aefbcf5035f79ba02097bb3909863\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:05 functional-006924 containerd[5249]: time="2025-12-19T06:12:05.465568210Z" level=info msg="ImageUpdate event name:\"docker.io/library/minikube-local-cache-test:functional-006924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:06 functional-006924 containerd[5249]: time="2025-12-19T06:12:06.284212603Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\""
	Dec 19 06:12:06 functional-006924 containerd[5249]: time="2025-12-19T06:12:06.286648914Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:latest\""
	Dec 19 06:12:06 functional-006924 containerd[5249]: time="2025-12-19T06:12:06.288918863Z" level=info msg="ImageDelete event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\""
	Dec 19 06:12:06 functional-006924 containerd[5249]: time="2025-12-19T06:12:06.302188402Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\" returns successfully"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.264422448Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\""
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.266842808Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:3.1\""
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.269981025Z" level=info msg="ImageDelete event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\""
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.276420564Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\" returns successfully"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.444119016Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.447262763Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.454411610Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.454763778Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.573995122Z" level=info msg="No images store for sha256:3ac89611d5efd8eb74174b1f04c33b7e73b651cec35b5498caf0cfdd2efd7d48"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.576369794Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.1\""
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.583465595Z" level=info msg="ImageCreate event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.584066947Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:12:09.389661    9215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:12:09.390316    9215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:12:09.391855    9215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:12:09.392284    9215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:12:09.393719    9215 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 06:12:09 up 10:54,  0 user,  load average: 0.62, 0.38, 0.75
	Linux functional-006924 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 19 06:12:06 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:12:07 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 824.
	Dec 19 06:12:07 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:07 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:07 functional-006924 kubelet[9013]: E1219 06:12:07.224101    9013 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:12:07 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:12:07 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:12:07 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 825.
	Dec 19 06:12:07 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:07 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:07 functional-006924 kubelet[9111]: E1219 06:12:07.974606    9111 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:12:07 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:12:07 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:12:08 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 826.
	Dec 19 06:12:08 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:08 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:08 functional-006924 kubelet[9132]: E1219 06:12:08.713658    9132 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:12:08 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:12:08 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:12:09 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 827.
	Dec 19 06:12:09 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:09 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:09 functional-006924 kubelet[9220]: E1219 06:12:09.457177    9220 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:12:09 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:12:09 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924: exit status 2 (356.00798ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-006924" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmd (2.24s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly (2.39s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly
functional_test.go:756: (dbg) Run:  out/kubectl --context functional-006924 get pods
functional_test.go:756: (dbg) Non-zero exit: out/kubectl --context functional-006924 get pods: exit status 1 (100.094662ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:759: failed to run kubectl directly. args "out/kubectl --context functional-006924 get pods": exit status 1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-006924
helpers_test.go:244: (dbg) docker inspect functional-006924:

-- stdout --
	[
	    {
	        "Id": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	        "Created": "2025-12-19T05:57:32.987616309Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2053574,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T05:57:33.050252475Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hostname",
	        "HostsPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hosts",
	        "LogPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6-json.log",
	        "Name": "/functional-006924",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-006924:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-006924",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	                "LowerDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/merged",
	                "UpperDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/diff",
	                "WorkDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-006924",
	                "Source": "/var/lib/docker/volumes/functional-006924/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-006924",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-006924",
	                "name.minikube.sigs.k8s.io": "functional-006924",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "c06ab2bd44169716d410789ed39ed6e7c04e20cbf7fddb96691439282b9c97ca",
	            "SandboxKey": "/var/run/docker/netns/c06ab2bd4416",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34704"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34705"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34708"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34706"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34707"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-006924": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "42:2f:87:6a:a8:7b",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "f63e8dc2cff83663f8a4d14108f192e61e457410fa4fc720cd9630dbf354815d",
	                    "EndpointID": "aa2b1cbd90d5c1f6130481423d97f82d974d4197e41ad0dbe3b7e51b22c8b4cc",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-006924",
	                        "651d0d6ef1db"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924: exit status 2 (305.022615ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs -n 25
helpers_test.go:256: (dbg) Done: out/minikube-linux-arm64 -p functional-006924 logs -n 25: (1.000715996s)
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                         ARGS                                                                          │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-125117 image build -t localhost/my-image:functional-125117 testdata/build --alsologtostderr                                                │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls                                                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format json --alsologtostderr                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format table --alsologtostderr                                                                                           │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ delete         │ -p functional-125117                                                                                                                                  │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ start          │ -p functional-006924 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │                     │
	│ start          │ -p functional-006924 --alsologtostderr -v=8                                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:05 UTC │                     │
	│ cache          │ functional-006924 cache add registry.k8s.io/pause:3.1                                                                                                 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache add registry.k8s.io/pause:3.3                                                                                                 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache add registry.k8s.io/pause:latest                                                                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache add minikube-local-cache-test:functional-006924                                                                               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache delete minikube-local-cache-test:functional-006924                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ delete registry.k8s.io/pause:3.3                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ list                                                                                                                                                  │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl images                                                                                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                    │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │                     │
	│ cache          │ functional-006924 cache reload                                                                                                                        │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ delete registry.k8s.io/pause:3.1                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ delete registry.k8s.io/pause:latest                                                                                                                   │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ kubectl        │ functional-006924 kubectl -- --context functional-006924 get pods                                                                                     │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │                     │
	└────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 06:05:50
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 06:05:50.537990 2059048 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:05:50.538849 2059048 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:05:50.538894 2059048 out.go:374] Setting ErrFile to fd 2...
	I1219 06:05:50.538913 2059048 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:05:50.539188 2059048 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:05:50.539610 2059048 out.go:368] Setting JSON to false
	I1219 06:05:50.540502 2059048 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":38897,"bootTime":1766085454,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:05:50.540601 2059048 start.go:143] virtualization:  
	I1219 06:05:50.544140 2059048 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:05:50.547152 2059048 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:05:50.547218 2059048 notify.go:221] Checking for updates...
	I1219 06:05:50.550931 2059048 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:05:50.553869 2059048 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:50.556730 2059048 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:05:50.559634 2059048 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:05:50.562450 2059048 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:05:50.565702 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:50.565828 2059048 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:05:50.590709 2059048 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:05:50.590846 2059048 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:05:50.653898 2059048 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:05:50.644590744 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:05:50.654020 2059048 docker.go:319] overlay module found
	I1219 06:05:50.657204 2059048 out.go:179] * Using the docker driver based on existing profile
	I1219 06:05:50.660197 2059048 start.go:309] selected driver: docker
	I1219 06:05:50.660214 2059048 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:50.660310 2059048 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:05:50.660408 2059048 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:05:50.713439 2059048 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:05:50.704333478 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:05:50.713872 2059048 cni.go:84] Creating CNI manager for ""
	I1219 06:05:50.713935 2059048 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:05:50.713992 2059048 start.go:353] cluster config:
	{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:50.717210 2059048 out.go:179] * Starting "functional-006924" primary control-plane node in "functional-006924" cluster
	I1219 06:05:50.719980 2059048 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1219 06:05:50.722935 2059048 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1219 06:05:50.726070 2059048 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:05:50.726124 2059048 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1219 06:05:50.726135 2059048 cache.go:65] Caching tarball of preloaded images
	I1219 06:05:50.726179 2059048 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1219 06:05:50.726225 2059048 preload.go:238] Found /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1219 06:05:50.726236 2059048 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1219 06:05:50.726339 2059048 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/config.json ...
	I1219 06:05:50.745888 2059048 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1219 06:05:50.745915 2059048 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1219 06:05:50.745932 2059048 cache.go:243] Successfully downloaded all kic artifacts
	I1219 06:05:50.745963 2059048 start.go:360] acquireMachinesLock for functional-006924: {Name:mkc84f48e83d18024791d45db780f3ccd746613a Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1219 06:05:50.746023 2059048 start.go:364] duration metric: took 37.752µs to acquireMachinesLock for "functional-006924"
	I1219 06:05:50.746049 2059048 start.go:96] Skipping create...Using existing machine configuration
	I1219 06:05:50.746059 2059048 fix.go:54] fixHost starting: 
	I1219 06:05:50.746334 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:50.762745 2059048 fix.go:112] recreateIfNeeded on functional-006924: state=Running err=<nil>
	W1219 06:05:50.762777 2059048 fix.go:138] unexpected machine state, will restart: <nil>
	I1219 06:05:50.765990 2059048 out.go:252] * Updating the running docker "functional-006924" container ...
	I1219 06:05:50.766020 2059048 machine.go:94] provisionDockerMachine start ...
	I1219 06:05:50.766101 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:50.782668 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:50.783000 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:50.783017 2059048 main.go:144] libmachine: About to run SSH command:
	hostname
	I1219 06:05:50.940618 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:05:50.940641 2059048 ubuntu.go:182] provisioning hostname "functional-006924"
	I1219 06:05:50.940708 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:50.964854 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:50.965181 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:50.965199 2059048 main.go:144] libmachine: About to run SSH command:
	sudo hostname functional-006924 && echo "functional-006924" | sudo tee /etc/hostname
	I1219 06:05:51.129720 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:05:51.129816 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.147357 2059048 main.go:144] libmachine: Using SSH client type: native
	I1219 06:05:51.147663 2059048 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:05:51.147686 2059048 main.go:144] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-006924' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-006924/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-006924' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1219 06:05:51.301923 2059048 main.go:144] libmachine: SSH cmd err, output: <nil>: 
	I1219 06:05:51.301949 2059048 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22230-1998525/.minikube CaCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22230-1998525/.minikube}
	I1219 06:05:51.301977 2059048 ubuntu.go:190] setting up certificates
	I1219 06:05:51.301985 2059048 provision.go:84] configureAuth start
	I1219 06:05:51.302047 2059048 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:05:51.323653 2059048 provision.go:143] copyHostCerts
	I1219 06:05:51.323700 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:05:51.323742 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem, removing ...
	I1219 06:05:51.323756 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:05:51.323832 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem (1078 bytes)
	I1219 06:05:51.323915 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:05:51.323932 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem, removing ...
	I1219 06:05:51.323937 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:05:51.323964 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem (1123 bytes)
	I1219 06:05:51.324003 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:05:51.324018 2059048 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem, removing ...
	I1219 06:05:51.324022 2059048 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:05:51.324044 2059048 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem (1671 bytes)
	I1219 06:05:51.324090 2059048 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem org=jenkins.functional-006924 san=[127.0.0.1 192.168.49.2 functional-006924 localhost minikube]
	I1219 06:05:51.441821 2059048 provision.go:177] copyRemoteCerts
	I1219 06:05:51.441886 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1219 06:05:51.441926 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.459787 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.570296 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1219 06:05:51.570372 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1219 06:05:51.588363 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1219 06:05:51.588477 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1219 06:05:51.605684 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1219 06:05:51.605798 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1219 06:05:51.623473 2059048 provision.go:87] duration metric: took 321.473451ms to configureAuth
	I1219 06:05:51.623556 2059048 ubuntu.go:206] setting minikube options for container-runtime
	I1219 06:05:51.623741 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:51.623756 2059048 machine.go:97] duration metric: took 857.728961ms to provisionDockerMachine
	I1219 06:05:51.623765 2059048 start.go:293] postStartSetup for "functional-006924" (driver="docker")
	I1219 06:05:51.623788 2059048 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1219 06:05:51.623849 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1219 06:05:51.623892 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.641371 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.760842 2059048 ssh_runner.go:195] Run: cat /etc/os-release
	I1219 06:05:51.764225 2059048 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1219 06:05:51.764245 2059048 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1219 06:05:51.764250 2059048 command_runner.go:130] > VERSION_ID="12"
	I1219 06:05:51.764255 2059048 command_runner.go:130] > VERSION="12 (bookworm)"
	I1219 06:05:51.764259 2059048 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1219 06:05:51.764263 2059048 command_runner.go:130] > ID=debian
	I1219 06:05:51.764268 2059048 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1219 06:05:51.764273 2059048 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1219 06:05:51.764281 2059048 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1219 06:05:51.764323 2059048 main.go:144] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1219 06:05:51.764339 2059048 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1219 06:05:51.764350 2059048 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/addons for local assets ...
	I1219 06:05:51.764404 2059048 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/files for local assets ...
	I1219 06:05:51.764485 2059048 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> 20003862.pem in /etc/ssl/certs
	I1219 06:05:51.764491 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> /etc/ssl/certs/20003862.pem
	I1219 06:05:51.764572 2059048 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> hosts in /etc/test/nested/copy/2000386
	I1219 06:05:51.764576 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> /etc/test/nested/copy/2000386/hosts
	I1219 06:05:51.764619 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/2000386
	I1219 06:05:51.772196 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:05:51.790438 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts --> /etc/test/nested/copy/2000386/hosts (40 bytes)
	I1219 06:05:51.808099 2059048 start.go:296] duration metric: took 184.303334ms for postStartSetup
	I1219 06:05:51.808203 2059048 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:05:51.808277 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.825566 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.929610 2059048 command_runner.go:130] > 14%
	I1219 06:05:51.930200 2059048 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1219 06:05:51.934641 2059048 command_runner.go:130] > 169G
	I1219 06:05:51.935117 2059048 fix.go:56] duration metric: took 1.189053781s for fixHost
	I1219 06:05:51.935139 2059048 start.go:83] releasing machines lock for "functional-006924", held for 1.189101272s
	I1219 06:05:51.935225 2059048 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:05:51.954055 2059048 ssh_runner.go:195] Run: cat /version.json
	I1219 06:05:51.954105 2059048 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1219 06:05:51.954110 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.954164 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:51.979421 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:51.998216 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:52.088735 2059048 command_runner.go:130] > {"iso_version": "v1.37.0-1765846775-22141", "kicbase_version": "v0.0.48-1765966054-22186", "minikube_version": "v1.37.0", "commit": "c344550999bcbb78f38b2df057224788bb2d30b2"}
	I1219 06:05:52.088901 2059048 ssh_runner.go:195] Run: systemctl --version
	I1219 06:05:52.184102 2059048 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1219 06:05:52.186843 2059048 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1219 06:05:52.186921 2059048 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1219 06:05:52.187021 2059048 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1219 06:05:52.191424 2059048 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1219 06:05:52.191590 2059048 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1219 06:05:52.191669 2059048 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1219 06:05:52.199647 2059048 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1219 06:05:52.199671 2059048 start.go:496] detecting cgroup driver to use...
	I1219 06:05:52.199702 2059048 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1219 06:05:52.199771 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1219 06:05:52.215530 2059048 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1219 06:05:52.228927 2059048 docker.go:218] disabling cri-docker service (if available) ...
	I1219 06:05:52.229039 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1219 06:05:52.245166 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1219 06:05:52.258582 2059048 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1219 06:05:52.378045 2059048 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1219 06:05:52.513092 2059048 docker.go:234] disabling docker service ...
	I1219 06:05:52.513180 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1219 06:05:52.528704 2059048 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1219 06:05:52.542109 2059048 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1219 06:05:52.652456 2059048 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1219 06:05:52.767269 2059048 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1219 06:05:52.781039 2059048 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1219 06:05:52.797281 2059048 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1219 06:05:52.797396 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1219 06:05:52.807020 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1219 06:05:52.816571 2059048 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1219 06:05:52.816661 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1219 06:05:52.826225 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:05:52.835109 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1219 06:05:52.843741 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:05:52.852504 2059048 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1219 06:05:52.860160 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1219 06:05:52.868883 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1219 06:05:52.877906 2059048 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1219 06:05:52.887403 2059048 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1219 06:05:52.894024 2059048 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1219 06:05:52.894921 2059048 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1219 06:05:52.902164 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:53.021703 2059048 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1219 06:05:53.168216 2059048 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1219 06:05:53.168331 2059048 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1219 06:05:53.171951 2059048 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1219 06:05:53.172022 2059048 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1219 06:05:53.172043 2059048 command_runner.go:130] > Device: 0,72	Inode: 1614        Links: 1
	I1219 06:05:53.172065 2059048 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1219 06:05:53.172084 2059048 command_runner.go:130] > Access: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172112 2059048 command_runner.go:130] > Modify: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172131 2059048 command_runner.go:130] > Change: 2025-12-19 06:05:53.119867628 +0000
	I1219 06:05:53.172148 2059048 command_runner.go:130] >  Birth: -
	I1219 06:05:53.172331 2059048 start.go:564] Will wait 60s for crictl version
	I1219 06:05:53.172432 2059048 ssh_runner.go:195] Run: which crictl
	I1219 06:05:53.175887 2059048 command_runner.go:130] > /usr/local/bin/crictl
	I1219 06:05:53.176199 2059048 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1219 06:05:53.203136 2059048 command_runner.go:130] > Version:  0.1.0
	I1219 06:05:53.203389 2059048 command_runner.go:130] > RuntimeName:  containerd
	I1219 06:05:53.203588 2059048 command_runner.go:130] > RuntimeVersion:  v2.2.0
	I1219 06:05:53.203784 2059048 command_runner.go:130] > RuntimeApiVersion:  v1
	I1219 06:05:53.207710 2059048 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1219 06:05:53.207845 2059048 ssh_runner.go:195] Run: containerd --version
	I1219 06:05:53.235328 2059048 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1219 06:05:53.237219 2059048 ssh_runner.go:195] Run: containerd --version
	I1219 06:05:53.254490 2059048 command_runner.go:130] > containerd containerd.io v2.2.0 1c4457e00facac03ce1d75f7b6777a7a851e5c41
	I1219 06:05:53.262101 2059048 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1219 06:05:53.264978 2059048 cli_runner.go:164] Run: docker network inspect functional-006924 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 06:05:53.280549 2059048 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1219 06:05:53.284647 2059048 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1219 06:05:53.284847 2059048 kubeadm.go:884] updating cluster {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1219 06:05:53.284979 2059048 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:05:53.285048 2059048 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:05:53.307306 2059048 command_runner.go:130] > {
	I1219 06:05:53.307331 2059048 command_runner.go:130] >   "images":  [
	I1219 06:05:53.307335 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307345 2059048 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1219 06:05:53.307350 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307356 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1219 06:05:53.307360 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307365 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307373 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1219 06:05:53.307380 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307385 2059048 command_runner.go:130] >       "size":  "40636774",
	I1219 06:05:53.307391 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307395 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307402 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307405 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307417 2059048 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1219 06:05:53.307425 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307431 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1219 06:05:53.307435 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307441 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307450 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1219 06:05:53.307455 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307460 2059048 command_runner.go:130] >       "size":  "8034419",
	I1219 06:05:53.307463 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307467 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307470 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307482 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307492 2059048 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1219 06:05:53.307496 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307501 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1219 06:05:53.307505 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307519 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307528 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1219 06:05:53.307534 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307538 2059048 command_runner.go:130] >       "size":  "21168808",
	I1219 06:05:53.307542 2059048 command_runner.go:130] >       "username":  "nonroot",
	I1219 06:05:53.307546 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307549 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307552 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307559 2059048 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1219 06:05:53.307565 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307570 2059048 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1219 06:05:53.307581 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307585 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307592 2059048 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1219 06:05:53.307598 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307602 2059048 command_runner.go:130] >       "size":  "21749640",
	I1219 06:05:53.307610 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307614 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307618 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307622 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307625 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307631 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307634 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307641 2059048 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1219 06:05:53.307647 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307653 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1219 06:05:53.307666 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307670 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307682 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1219 06:05:53.307689 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307693 2059048 command_runner.go:130] >       "size":  "24692223",
	I1219 06:05:53.307697 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307708 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307712 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307716 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307723 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307726 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307729 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307736 2059048 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1219 06:05:53.307742 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307748 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1219 06:05:53.307753 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307757 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307765 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1219 06:05:53.307769 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307773 2059048 command_runner.go:130] >       "size":  "20672157",
	I1219 06:05:53.307779 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307783 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307788 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307792 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307796 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307799 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307802 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307809 2059048 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1219 06:05:53.307813 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307821 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1219 06:05:53.307826 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307830 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307840 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1219 06:05:53.307845 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307849 2059048 command_runner.go:130] >       "size":  "22432301",
	I1219 06:05:53.307858 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307864 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307867 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307870 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307877 2059048 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1219 06:05:53.307884 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307889 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1219 06:05:53.307893 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307899 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307907 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1219 06:05:53.307913 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307917 2059048 command_runner.go:130] >       "size":  "15405535",
	I1219 06:05:53.307921 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.307925 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.307928 2059048 command_runner.go:130] >       },
	I1219 06:05:53.307932 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.307939 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.307942 2059048 command_runner.go:130] >     },
	I1219 06:05:53.307948 2059048 command_runner.go:130] >     {
	I1219 06:05:53.307955 2059048 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1219 06:05:53.307963 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.307967 2059048 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1219 06:05:53.307970 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307974 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.307982 2059048 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1219 06:05:53.307987 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.307991 2059048 command_runner.go:130] >       "size":  "267939",
	I1219 06:05:53.307996 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.308000 2059048 command_runner.go:130] >         "value":  "65535"
	I1219 06:05:53.308004 2059048 command_runner.go:130] >       },
	I1219 06:05:53.308011 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.308015 2059048 command_runner.go:130] >       "pinned":  true
	I1219 06:05:53.308020 2059048 command_runner.go:130] >     }
	I1219 06:05:53.308027 2059048 command_runner.go:130] >   ]
	I1219 06:05:53.308030 2059048 command_runner.go:130] > }
	I1219 06:05:53.310449 2059048 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:05:53.310472 2059048 containerd.go:534] Images already preloaded, skipping extraction
	I1219 06:05:53.310540 2059048 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:05:53.331271 2059048 command_runner.go:130] > {
	I1219 06:05:53.331288 2059048 command_runner.go:130] >   "images":  [
	I1219 06:05:53.331292 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331304 2059048 command_runner.go:130] >       "id":  "sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c",
	I1219 06:05:53.331309 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331314 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd:v20250512-df8de77b"
	I1219 06:05:53.331318 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331322 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331332 2059048 command_runner.go:130] >         "docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"
	I1219 06:05:53.331336 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331340 2059048 command_runner.go:130] >       "size":  "40636774",
	I1219 06:05:53.331350 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331355 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331358 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331361 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331369 2059048 command_runner.go:130] >       "id":  "sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6",
	I1219 06:05:53.331373 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331378 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1219 06:05:53.331381 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331385 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331393 2059048 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"
	I1219 06:05:53.331396 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331400 2059048 command_runner.go:130] >       "size":  "8034419",
	I1219 06:05:53.331404 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331408 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331411 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331414 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331421 2059048 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1219 06:05:53.331425 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331430 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1219 06:05:53.331433 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331439 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331447 2059048 command_runner.go:130] >         "registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"
	I1219 06:05:53.331451 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331454 2059048 command_runner.go:130] >       "size":  "21168808",
	I1219 06:05:53.331458 2059048 command_runner.go:130] >       "username":  "nonroot",
	I1219 06:05:53.331462 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331466 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331468 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331475 2059048 command_runner.go:130] >       "id":  "sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57",
	I1219 06:05:53.331479 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331484 2059048 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.6-0"
	I1219 06:05:53.331487 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331491 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331502 2059048 command_runner.go:130] >         "registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"
	I1219 06:05:53.331506 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331510 2059048 command_runner.go:130] >       "size":  "21749640",
	I1219 06:05:53.331515 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331519 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331522 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331526 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331530 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331533 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331536 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331543 2059048 command_runner.go:130] >       "id":  "sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54",
	I1219 06:05:53.331547 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331551 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-rc.1"
	I1219 06:05:53.331555 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331559 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331566 2059048 command_runner.go:130] >         "registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"
	I1219 06:05:53.331569 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331573 2059048 command_runner.go:130] >       "size":  "24692223",
	I1219 06:05:53.331577 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331585 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331592 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331596 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331600 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331603 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331606 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331613 2059048 command_runner.go:130] >       "id":  "sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a",
	I1219 06:05:53.331617 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331622 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"
	I1219 06:05:53.331626 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331629 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331638 2059048 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"
	I1219 06:05:53.331641 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331645 2059048 command_runner.go:130] >       "size":  "20672157",
	I1219 06:05:53.331652 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331656 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331659 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331663 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331666 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331669 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331672 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331679 2059048 command_runner.go:130] >       "id":  "sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e",
	I1219 06:05:53.331683 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331688 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-rc.1"
	I1219 06:05:53.331691 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331695 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331702 2059048 command_runner.go:130] >         "registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"
	I1219 06:05:53.331705 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331709 2059048 command_runner.go:130] >       "size":  "22432301",
	I1219 06:05:53.331713 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331717 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331720 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331723 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331733 2059048 command_runner.go:130] >       "id":  "sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde",
	I1219 06:05:53.331737 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331742 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-rc.1"
	I1219 06:05:53.331745 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331749 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331757 2059048 command_runner.go:130] >         "registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"
	I1219 06:05:53.331760 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331764 2059048 command_runner.go:130] >       "size":  "15405535",
	I1219 06:05:53.331767 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331771 2059048 command_runner.go:130] >         "value":  "0"
	I1219 06:05:53.331774 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331778 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331782 2059048 command_runner.go:130] >       "pinned":  false
	I1219 06:05:53.331785 2059048 command_runner.go:130] >     },
	I1219 06:05:53.331792 2059048 command_runner.go:130] >     {
	I1219 06:05:53.331799 2059048 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1219 06:05:53.331803 2059048 command_runner.go:130] >       "repoTags":  [
	I1219 06:05:53.331807 2059048 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1219 06:05:53.331811 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331815 2059048 command_runner.go:130] >       "repoDigests":  [
	I1219 06:05:53.331822 2059048 command_runner.go:130] >         "registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"
	I1219 06:05:53.331826 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.331829 2059048 command_runner.go:130] >       "size":  "267939",
	I1219 06:05:53.331833 2059048 command_runner.go:130] >       "uid":  {
	I1219 06:05:53.331837 2059048 command_runner.go:130] >         "value":  "65535"
	I1219 06:05:53.331841 2059048 command_runner.go:130] >       },
	I1219 06:05:53.331845 2059048 command_runner.go:130] >       "username":  "",
	I1219 06:05:53.331849 2059048 command_runner.go:130] >       "pinned":  true
	I1219 06:05:53.331852 2059048 command_runner.go:130] >     }
	I1219 06:05:53.331855 2059048 command_runner.go:130] >   ]
	I1219 06:05:53.331858 2059048 command_runner.go:130] > }
	I1219 06:05:53.333541 2059048 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:05:53.333565 2059048 cache_images.go:86] Images are preloaded, skipping loading
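The "images are preloaded, skipping loading" decision above boils down to comparing the repoTags reported by the CRI against the set of images the Kubernetes version requires. A minimal sketch of that check, assuming a hypothetical helper name and the repoTags structure shown in the log (this is not minikube's actual code):

```python
# Sketch: decide whether cached images still need to be transferred, given
# image entries shaped like the `crictl images` JSON above.
def all_images_preloaded(cri_images: list[dict], required: list[str]) -> bool:
    """True when every required tag appears among the CRI-reported repoTags."""
    loaded = {tag for img in cri_images for tag in img.get("repoTags", [])}
    return all(tag in loaded for tag in required)

cri_images = [
    {"repoTags": ["registry.k8s.io/kube-scheduler:v1.35.0-rc.1"]},
    {"repoTags": ["registry.k8s.io/pause:3.10.1"]},
]
print(all_images_preloaded(cri_images, ["registry.k8s.io/pause:3.10.1"]))  # True
```

When the check passes, the expensive `cache_images` transfer step is skipped, which is why the log jumps straight to updating the node configuration.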
	I1219 06:05:53.333574 2059048 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1219 06:05:53.333694 2059048 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-006924 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1219 06:05:53.333773 2059048 ssh_runner.go:195] Run: sudo crictl --timeout=10s info
	I1219 06:05:53.354890 2059048 command_runner.go:130] > {
	I1219 06:05:53.354909 2059048 command_runner.go:130] >   "cniconfig": {
	I1219 06:05:53.354915 2059048 command_runner.go:130] >     "Networks": [
	I1219 06:05:53.354919 2059048 command_runner.go:130] >       {
	I1219 06:05:53.354926 2059048 command_runner.go:130] >         "Config": {
	I1219 06:05:53.354932 2059048 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1219 06:05:53.354937 2059048 command_runner.go:130] >           "Name": "cni-loopback",
	I1219 06:05:53.354941 2059048 command_runner.go:130] >           "Plugins": [
	I1219 06:05:53.354945 2059048 command_runner.go:130] >             {
	I1219 06:05:53.354949 2059048 command_runner.go:130] >               "Network": {
	I1219 06:05:53.354953 2059048 command_runner.go:130] >                 "ipam": {},
	I1219 06:05:53.354958 2059048 command_runner.go:130] >                 "type": "loopback"
	I1219 06:05:53.354962 2059048 command_runner.go:130] >               },
	I1219 06:05:53.354967 2059048 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1219 06:05:53.354971 2059048 command_runner.go:130] >             }
	I1219 06:05:53.354975 2059048 command_runner.go:130] >           ],
	I1219 06:05:53.354988 2059048 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1219 06:05:53.354992 2059048 command_runner.go:130] >         },
	I1219 06:05:53.354997 2059048 command_runner.go:130] >         "IFName": "lo"
	I1219 06:05:53.355000 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355003 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355007 2059048 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1219 06:05:53.355011 2059048 command_runner.go:130] >     "PluginDirs": [
	I1219 06:05:53.355015 2059048 command_runner.go:130] >       "/opt/cni/bin"
	I1219 06:05:53.355027 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355031 2059048 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1219 06:05:53.355036 2059048 command_runner.go:130] >     "Prefix": "eth"
	I1219 06:05:53.355039 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355042 2059048 command_runner.go:130] >   "config": {
	I1219 06:05:53.355046 2059048 command_runner.go:130] >     "cdiSpecDirs": [
	I1219 06:05:53.355050 2059048 command_runner.go:130] >       "/etc/cdi",
	I1219 06:05:53.355059 2059048 command_runner.go:130] >       "/var/run/cdi"
	I1219 06:05:53.355062 2059048 command_runner.go:130] >     ],
	I1219 06:05:53.355066 2059048 command_runner.go:130] >     "cni": {
	I1219 06:05:53.355070 2059048 command_runner.go:130] >       "binDir": "",
	I1219 06:05:53.355073 2059048 command_runner.go:130] >       "binDirs": [
	I1219 06:05:53.355077 2059048 command_runner.go:130] >         "/opt/cni/bin"
	I1219 06:05:53.355080 2059048 command_runner.go:130] >       ],
	I1219 06:05:53.355084 2059048 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1219 06:05:53.355088 2059048 command_runner.go:130] >       "confTemplate": "",
	I1219 06:05:53.355091 2059048 command_runner.go:130] >       "ipPref": "",
	I1219 06:05:53.355095 2059048 command_runner.go:130] >       "maxConfNum": 1,
	I1219 06:05:53.355099 2059048 command_runner.go:130] >       "setupSerially": false,
	I1219 06:05:53.355103 2059048 command_runner.go:130] >       "useInternalLoopback": false
	I1219 06:05:53.355106 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355114 2059048 command_runner.go:130] >     "containerd": {
	I1219 06:05:53.355119 2059048 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1219 06:05:53.355123 2059048 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1219 06:05:53.355128 2059048 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1219 06:05:53.355132 2059048 command_runner.go:130] >       "runtimes": {
	I1219 06:05:53.355136 2059048 command_runner.go:130] >         "runc": {
	I1219 06:05:53.355140 2059048 command_runner.go:130] >           "ContainerAnnotations": null,
	I1219 06:05:53.355145 2059048 command_runner.go:130] >           "PodAnnotations": null,
	I1219 06:05:53.355151 2059048 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1219 06:05:53.355155 2059048 command_runner.go:130] >           "cgroupWritable": false,
	I1219 06:05:53.355159 2059048 command_runner.go:130] >           "cniConfDir": "",
	I1219 06:05:53.355163 2059048 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1219 06:05:53.355167 2059048 command_runner.go:130] >           "io_type": "",
	I1219 06:05:53.355171 2059048 command_runner.go:130] >           "options": {
	I1219 06:05:53.355174 2059048 command_runner.go:130] >             "BinaryName": "",
	I1219 06:05:53.355179 2059048 command_runner.go:130] >             "CriuImagePath": "",
	I1219 06:05:53.355183 2059048 command_runner.go:130] >             "CriuWorkPath": "",
	I1219 06:05:53.355187 2059048 command_runner.go:130] >             "IoGid": 0,
	I1219 06:05:53.355190 2059048 command_runner.go:130] >             "IoUid": 0,
	I1219 06:05:53.355198 2059048 command_runner.go:130] >             "NoNewKeyring": false,
	I1219 06:05:53.355201 2059048 command_runner.go:130] >             "Root": "",
	I1219 06:05:53.355205 2059048 command_runner.go:130] >             "ShimCgroup": "",
	I1219 06:05:53.355210 2059048 command_runner.go:130] >             "SystemdCgroup": false
	I1219 06:05:53.355214 2059048 command_runner.go:130] >           },
	I1219 06:05:53.355219 2059048 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1219 06:05:53.355225 2059048 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1219 06:05:53.355229 2059048 command_runner.go:130] >           "runtimePath": "",
	I1219 06:05:53.355233 2059048 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1219 06:05:53.355238 2059048 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1219 06:05:53.355242 2059048 command_runner.go:130] >           "snapshotter": ""
	I1219 06:05:53.355245 2059048 command_runner.go:130] >         }
	I1219 06:05:53.355248 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355252 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355262 2059048 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1219 06:05:53.355267 2059048 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1219 06:05:53.355273 2059048 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1219 06:05:53.355277 2059048 command_runner.go:130] >     "disableApparmor": false,
	I1219 06:05:53.355282 2059048 command_runner.go:130] >     "disableHugetlbController": true,
	I1219 06:05:53.355286 2059048 command_runner.go:130] >     "disableProcMount": false,
	I1219 06:05:53.355290 2059048 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1219 06:05:53.355294 2059048 command_runner.go:130] >     "enableCDI": true,
	I1219 06:05:53.355298 2059048 command_runner.go:130] >     "enableSelinux": false,
	I1219 06:05:53.355302 2059048 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1219 06:05:53.355306 2059048 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1219 06:05:53.355311 2059048 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1219 06:05:53.355319 2059048 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1219 06:05:53.355323 2059048 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1219 06:05:53.355328 2059048 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1219 06:05:53.355332 2059048 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1219 06:05:53.355338 2059048 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1219 06:05:53.355342 2059048 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1219 06:05:53.355347 2059048 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1219 06:05:53.355357 2059048 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1219 06:05:53.355362 2059048 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1219 06:05:53.355365 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355369 2059048 command_runner.go:130] >   "features": {
	I1219 06:05:53.355373 2059048 command_runner.go:130] >     "supplemental_groups_policy": true
	I1219 06:05:53.355376 2059048 command_runner.go:130] >   },
	I1219 06:05:53.355379 2059048 command_runner.go:130] >   "golang": "go1.24.9",
	I1219 06:05:53.355389 2059048 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1219 06:05:53.355399 2059048 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1219 06:05:53.355402 2059048 command_runner.go:130] >   "runtimeHandlers": [
	I1219 06:05:53.355406 2059048 command_runner.go:130] >     {
	I1219 06:05:53.355409 2059048 command_runner.go:130] >       "features": {
	I1219 06:05:53.355414 2059048 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1219 06:05:53.355418 2059048 command_runner.go:130] >         "user_namespaces": true
	I1219 06:05:53.355421 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355424 2059048 command_runner.go:130] >     },
	I1219 06:05:53.355427 2059048 command_runner.go:130] >     {
	I1219 06:05:53.355431 2059048 command_runner.go:130] >       "features": {
	I1219 06:05:53.355436 2059048 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1219 06:05:53.355440 2059048 command_runner.go:130] >         "user_namespaces": true
	I1219 06:05:53.355443 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355447 2059048 command_runner.go:130] >       "name": "runc"
	I1219 06:05:53.355449 2059048 command_runner.go:130] >     }
	I1219 06:05:53.355452 2059048 command_runner.go:130] >   ],
	I1219 06:05:53.355456 2059048 command_runner.go:130] >   "status": {
	I1219 06:05:53.355460 2059048 command_runner.go:130] >     "conditions": [
	I1219 06:05:53.355463 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355467 2059048 command_runner.go:130] >         "message": "",
	I1219 06:05:53.355471 2059048 command_runner.go:130] >         "reason": "",
	I1219 06:05:53.355475 2059048 command_runner.go:130] >         "status": true,
	I1219 06:05:53.355480 2059048 command_runner.go:130] >         "type": "RuntimeReady"
	I1219 06:05:53.355483 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355486 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355495 2059048 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1219 06:05:53.355500 2059048 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1219 06:05:53.355504 2059048 command_runner.go:130] >         "status": false,
	I1219 06:05:53.355508 2059048 command_runner.go:130] >         "type": "NetworkReady"
	I1219 06:05:53.355512 2059048 command_runner.go:130] >       },
	I1219 06:05:53.355515 2059048 command_runner.go:130] >       {
	I1219 06:05:53.355536 2059048 command_runner.go:130] >         "message": "{\"io.containerd.deprecation/cgroup-v1\":\"The support for cgroup v1 is deprecated since containerd v2.2 and will be removed by no later than May 2029. Upgrade the host to use cgroup v2.\"}",
	I1219 06:05:53.355541 2059048 command_runner.go:130] >         "reason": "ContainerdHasDeprecationWarnings",
	I1219 06:05:53.355547 2059048 command_runner.go:130] >         "status": false,
	I1219 06:05:53.355552 2059048 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1219 06:05:53.355555 2059048 command_runner.go:130] >       }
	I1219 06:05:53.355557 2059048 command_runner.go:130] >     ]
	I1219 06:05:53.355560 2059048 command_runner.go:130] >   }
	I1219 06:05:53.355563 2059048 command_runner.go:130] > }
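The `crictl info` dump above is what tells minikube that the runtime is up but networking is not: `RuntimeReady` is true while `NetworkReady` is false with reason `NetworkPluginNotReady`, which is why the very next log line creates a CNI manager and recommends kindnet. A sketch of reading those conditions from the JSON, using the exact shape shown in the log:

```python
import json

# Parse the status.conditions block from `crictl --timeout=10s info` output
# (abbreviated to the fields that matter for the readiness decision).
crictl_info = json.loads("""
{
  "status": {
    "conditions": [
      {"type": "RuntimeReady", "status": true, "reason": "", "message": ""},
      {"type": "NetworkReady", "status": false,
       "reason": "NetworkPluginNotReady",
       "message": "Network plugin returns error: cni plugin not initialized"}
    ]
  }
}
""")

conditions = {c["type"]: c["status"] for c in crictl_info["status"]["conditions"]}
print(conditions["RuntimeReady"], conditions["NetworkReady"])  # True False
```

A false `NetworkReady` at this stage is expected: the kindnet CNI config has not been written to /etc/cni/net.d yet, matching the `lastCNILoadStatus` message above.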
	I1219 06:05:53.357747 2059048 cni.go:84] Creating CNI manager for ""
	I1219 06:05:53.357770 2059048 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:05:53.357795 2059048 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1219 06:05:53.357824 2059048 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-006924 NodeName:functional-006924 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1219 06:05:53.357938 2059048 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-006924"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1219 06:05:53.358021 2059048 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1219 06:05:53.365051 2059048 command_runner.go:130] > kubeadm
	I1219 06:05:53.365070 2059048 command_runner.go:130] > kubectl
	I1219 06:05:53.365074 2059048 command_runner.go:130] > kubelet
	I1219 06:05:53.366033 2059048 binaries.go:51] Found k8s binaries, skipping transfer
	I1219 06:05:53.366118 2059048 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1219 06:05:53.373810 2059048 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1219 06:05:53.386231 2059048 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1219 06:05:53.399156 2059048 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2235 bytes)
	I1219 06:05:53.411832 2059048 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1219 06:05:53.415476 2059048 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
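The grep above verifies that /etc/hosts already maps the control-plane name to the node IP, so no new entry needs to be appended. The check can be sketched as a simple exact-line match (hypothetical helper name, not minikube's implementation):

```python
# Sketch of the /etc/hosts check: look for an exact "IP<TAB>hostname" line
# and only append one when it is missing.
def has_host_entry(hosts_text: str, ip: str, hostname: str) -> bool:
    needle = f"{ip}\t{hostname}"
    return any(line.strip() == needle for line in hosts_text.splitlines())

hosts = "127.0.0.1\tlocalhost\n192.168.49.2\tcontrol-plane.minikube.internal\n"
print(has_host_entry(hosts, "192.168.49.2", "control-plane.minikube.internal"))  # True
```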
	I1219 06:05:53.415580 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:53.524736 2059048 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:05:53.900522 2059048 certs.go:69] Setting up /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924 for IP: 192.168.49.2
	I1219 06:05:53.900547 2059048 certs.go:195] generating shared ca certs ...
	I1219 06:05:53.900563 2059048 certs.go:227] acquiring lock for ca certs: {Name:mk382c71693ea4061363f97b153b21bf6cdf5f38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:53.900702 2059048 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key
	I1219 06:05:53.900780 2059048 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key
	I1219 06:05:53.900803 2059048 certs.go:257] generating profile certs ...
	I1219 06:05:53.900908 2059048 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key
	I1219 06:05:53.900976 2059048 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key.febe6fed
	I1219 06:05:53.901024 2059048 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key
	I1219 06:05:53.901037 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1219 06:05:53.901081 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1219 06:05:53.901098 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1219 06:05:53.901109 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1219 06:05:53.901127 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1219 06:05:53.901139 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1219 06:05:53.901154 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1219 06:05:53.901171 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1219 06:05:53.901229 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem (1338 bytes)
	W1219 06:05:53.901264 2059048 certs.go:480] ignoring /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386_empty.pem, impossibly tiny 0 bytes
	I1219 06:05:53.901277 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem (1679 bytes)
	I1219 06:05:53.901306 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem (1078 bytes)
	I1219 06:05:53.901333 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem (1123 bytes)
	I1219 06:05:53.901365 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem (1671 bytes)
	I1219 06:05:53.901418 2059048 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:05:53.901449 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:53.901465 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem -> /usr/share/ca-certificates/2000386.pem
	I1219 06:05:53.901481 2059048 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> /usr/share/ca-certificates/20003862.pem
	I1219 06:05:53.902039 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1219 06:05:53.926748 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1219 06:05:53.945718 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1219 06:05:53.964111 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1219 06:05:53.984388 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1219 06:05:54.005796 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1219 06:05:54.027058 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1219 06:05:54.045330 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1219 06:05:54.062681 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1219 06:05:54.080390 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem --> /usr/share/ca-certificates/2000386.pem (1338 bytes)
	I1219 06:05:54.102399 2059048 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /usr/share/ca-certificates/20003862.pem (1708 bytes)
	I1219 06:05:54.120580 2059048 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (722 bytes)
	I1219 06:05:54.133732 2059048 ssh_runner.go:195] Run: openssl version
	I1219 06:05:54.139799 2059048 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1219 06:05:54.140191 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.147812 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/20003862.pem /etc/ssl/certs/20003862.pem
	I1219 06:05:54.155315 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159037 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159108 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.159165 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/20003862.pem
	I1219 06:05:54.200029 2059048 command_runner.go:130] > 3ec20f2e
	I1219 06:05:54.200546 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1219 06:05:54.208733 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.216254 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1219 06:05:54.224240 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228059 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228165 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.228244 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:05:54.268794 2059048 command_runner.go:130] > b5213941
	I1219 06:05:54.269372 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1219 06:05:54.277054 2059048 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.284467 2059048 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/2000386.pem /etc/ssl/certs/2000386.pem
	I1219 06:05:54.291949 2059048 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295750 2059048 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295798 2059048 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.295849 2059048 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/2000386.pem
	I1219 06:05:54.341163 2059048 command_runner.go:130] > 51391683
	I1219 06:05:54.341782 2059048 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
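The three openssl/symlink passes above follow OpenSSL's hashed-directory convention: `openssl x509 -hash -noout` prints the certificate's subject hash (e.g. `b5213941`), and the cert becomes trusted by being symlinked as `<hash>.<index>` in /etc/ssl/certs, where the index disambiguates colliding hashes (0 for the first). A sketch of just the naming scheme, assuming the hash string is already known:

```python
# Sketch: build the /etc/ssl/certs symlink name that OpenSSL's hashed
# lookup expects for a CA cert with the given subject hash.
def trust_symlink_path(subject_hash: str, index: int = 0) -> str:
    return f"/etc/ssl/certs/{subject_hash}.{index}"

print(trust_symlink_path("b5213941"))  # /etc/ssl/certs/b5213941.0
```

This is why each `openssl x509 -hash` line in the log is followed by a `sudo test -L /etc/ssl/certs/<hash>.0` probe.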
	I1219 06:05:54.349497 2059048 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:05:54.353229 2059048 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:05:54.353253 2059048 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1219 06:05:54.353261 2059048 command_runner.go:130] > Device: 259,1	Inode: 1582667     Links: 1
	I1219 06:05:54.353268 2059048 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1219 06:05:54.353275 2059048 command_runner.go:130] > Access: 2025-12-19 06:01:47.245300782 +0000
	I1219 06:05:54.353281 2059048 command_runner.go:130] > Modify: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353286 2059048 command_runner.go:130] > Change: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353294 2059048 command_runner.go:130] >  Birth: 2025-12-19 05:57:42.198721757 +0000
	I1219 06:05:54.353372 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1219 06:05:54.398897 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.399374 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1219 06:05:54.440111 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.440565 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1219 06:05:54.481409 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.481968 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1219 06:05:54.522576 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.523020 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1219 06:05:54.563365 2059048 command_runner.go:130] > Certificate will not expire
	I1219 06:05:54.563892 2059048 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1219 06:05:54.604428 2059048 command_runner.go:130] > Certificate will not expire
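Each `openssl x509 -checkend 86400` run above asks one question: does the certificate's notAfter timestamp lie more than 86400 seconds (24 hours) in the future? "Certificate will not expire" is openssl's success message for that case. The semantics can be sketched as:

```python
from datetime import datetime, timedelta, timezone

# Sketch of `openssl x509 -checkend N`: the cert "will not expire" when its
# notAfter is still more than N seconds away from now.
def will_not_expire(not_after: datetime, checkend_seconds: int = 86400) -> bool:
    return not_after > datetime.now(timezone.utc) + timedelta(seconds=checkend_seconds)

far_future = datetime.now(timezone.utc) + timedelta(days=365)
print(will_not_expire(far_future))  # True
```

Because every control-plane cert passes this check, minikube proceeds to StartCluster without regenerating any of them.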
	I1219 06:05:54.604920 2059048 kubeadm.go:401] StartCluster: {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:05:54.605002 2059048 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1219 06:05:54.605063 2059048 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:05:54.631433 2059048 cri.go:92] found id: ""
	I1219 06:05:54.631512 2059048 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1219 06:05:54.638289 2059048 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1219 06:05:54.638353 2059048 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1219 06:05:54.638374 2059048 command_runner.go:130] > /var/lib/minikube/etcd:
	I1219 06:05:54.639191 2059048 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1219 06:05:54.639207 2059048 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1219 06:05:54.639278 2059048 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1219 06:05:54.646289 2059048 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:05:54.646704 2059048 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-006924" does not appear in /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.646809 2059048 kubeconfig.go:62] /home/jenkins/minikube-integration/22230-1998525/kubeconfig needs updating (will repair): [kubeconfig missing "functional-006924" cluster setting kubeconfig missing "functional-006924" context setting]
	I1219 06:05:54.647118 2059048 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/kubeconfig: {Name:mk7db1732c7d76f01100426cb283dc7515a3b9ea Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.647542 2059048 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.647700 2059048 kapi.go:59] client config for functional-006924: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1219 06:05:54.648289 2059048 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1219 06:05:54.648312 2059048 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1219 06:05:54.648318 2059048 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1219 06:05:54.648377 2059048 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1219 06:05:54.648389 2059048 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1219 06:05:54.648357 2059048 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1219 06:05:54.648779 2059048 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1219 06:05:54.659696 2059048 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1219 06:05:54.659739 2059048 kubeadm.go:602] duration metric: took 20.517186ms to restartPrimaryControlPlane
	I1219 06:05:54.659750 2059048 kubeadm.go:403] duration metric: took 54.838405ms to StartCluster
	I1219 06:05:54.659766 2059048 settings.go:142] acquiring lock: {Name:mk0fb518a1861caea9ce90c087e9f98ff93c6842 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.659859 2059048 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.660602 2059048 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/kubeconfig: {Name:mk7db1732c7d76f01100426cb283dc7515a3b9ea Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:05:54.660878 2059048 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1219 06:05:54.661080 2059048 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:05:54.661197 2059048 addons.go:543] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1219 06:05:54.661465 2059048 addons.go:70] Setting storage-provisioner=true in profile "functional-006924"
	I1219 06:05:54.661481 2059048 addons.go:239] Setting addon storage-provisioner=true in "functional-006924"
	I1219 06:05:54.661506 2059048 host.go:66] Checking if "functional-006924" exists ...
	I1219 06:05:54.661954 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.662128 2059048 addons.go:70] Setting default-storageclass=true in profile "functional-006924"
	I1219 06:05:54.662158 2059048 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-006924"
	I1219 06:05:54.662427 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.667300 2059048 out.go:179] * Verifying Kubernetes components...
	I1219 06:05:54.673650 2059048 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:05:54.689683 2059048 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:05:54.689848 2059048 kapi.go:59] client config for functional-006924: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1219 06:05:54.690123 2059048 addons.go:239] Setting addon default-storageclass=true in "functional-006924"
	I1219 06:05:54.690152 2059048 host.go:66] Checking if "functional-006924" exists ...
	I1219 06:05:54.690560 2059048 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:05:54.715008 2059048 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1219 06:05:54.717850 2059048 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:54.717879 2059048 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1219 06:05:54.717946 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:54.734767 2059048 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:54.734788 2059048 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1219 06:05:54.734856 2059048 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:05:54.764236 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:54.773070 2059048 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:05:54.876977 2059048 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:05:54.898675 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:54.923995 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:55.652544 2059048 node_ready.go:35] waiting up to 6m0s for node "functional-006924" to be "Ready" ...
	I1219 06:05:55.652680 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:55.652777 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:55.653088 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.653133 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653174 2059048 retry.go:31] will retry after 152.748ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653242 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.653274 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653290 2059048 retry.go:31] will retry after 222.401366ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.653368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:55.806850 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:55.871164 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.871241 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.871268 2059048 retry.go:31] will retry after 248.166368ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.876351 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:55.932419 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:55.936105 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:55.936137 2059048 retry.go:31] will retry after 191.546131ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.120512 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:56.128049 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:56.153420 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:56.153544 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:56.153844 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:56.188805 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.192400 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.192475 2059048 retry.go:31] will retry after 421.141509ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.203130 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.203228 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.203252 2059048 retry.go:31] will retry after 495.708783ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.614800 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:56.653236 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:56.653361 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:56.653708 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:56.677894 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.677943 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.677993 2059048 retry.go:31] will retry after 980.857907ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.700099 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:56.755124 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:56.758623 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:56.758652 2059048 retry.go:31] will retry after 1.143622688s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.152911 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:57.153042 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:57.153399 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:57.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:57.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:57.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:05:57.653378 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:05:57.659518 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:57.724667 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:57.724716 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.724735 2059048 retry.go:31] will retry after 900.329628ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.903067 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:57.986230 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:57.986314 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:57.986340 2059048 retry.go:31] will retry after 1.7845791s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:58.153671 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:58.153749 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:58.154120 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:58.625732 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:05:58.653113 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:58.653187 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:58.653475 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:58.712944 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:58.713042 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:58.713071 2059048 retry.go:31] will retry after 2.322946675s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:59.153740 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:59.153822 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:59.154186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:59.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:05:59.652927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:05:59.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:05:59.771577 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:05:59.835749 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:05:59.839442 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:05:59.839476 2059048 retry.go:31] will retry after 2.412907222s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:00.152821 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:00.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:00.153306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:00.153393 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:00.653320 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:00.653404 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:00.653734 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:01.036322 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:01.102362 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:01.106179 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:01.106214 2059048 retry.go:31] will retry after 2.139899672s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:01.153490 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:01.153572 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:01.153855 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:01.653656 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:01.653732 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:01.654026 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:02.152793 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:02.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:02.153204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:02.252582 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:02.312437 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:02.312479 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:02.312500 2059048 retry.go:31] will retry after 1.566668648s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:02.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:02.652958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:02.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:02.653283 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:03.152957 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:03.153054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:03.153393 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:03.246844 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:03.302237 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:03.305728 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.305771 2059048 retry.go:31] will retry after 6.170177016s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.653408 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:03.653482 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:03.653834 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:03.880237 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:03.939688 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:03.939736 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:03.939756 2059048 retry.go:31] will retry after 4.919693289s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:04.153025 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:04.153101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:04.153368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:04.653333 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:04.653405 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:04.653716 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:04.653762 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:05.153589 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:05.153680 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:05.154012 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:05.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:05.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:05.653271 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:06.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:06.152922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:06.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:06.652913 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:06.652987 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:06.653350 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:07.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:07.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:07.153248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:07.153305 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:07.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:07.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:07.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:08.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:08.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.652856 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:08.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:08.653179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:08.859603 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:08.923746 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:08.923802 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:08.923824 2059048 retry.go:31] will retry after 7.49455239s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.153273 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:09.153361 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:09.153733 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:09.153794 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:09.476166 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:09.536340 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:09.536378 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.536397 2059048 retry.go:31] will retry after 3.264542795s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:09.652787 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:09.652863 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:09.653191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:10.152879 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:10.152955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:10.153217 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:10.653092 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:10.653172 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:10.653505 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:11.153189 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:11.153267 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:11.153564 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:11.653360 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:11.653432 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:11.653748 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:11.653809 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:12.153584 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:12.153667 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:12.154066 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:12.652816 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:12.652897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:12.653232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:12.801732 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:12.858668 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:12.858722 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:12.858742 2059048 retry.go:31] will retry after 7.015856992s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:13.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:13.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:13.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:13.652838 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:13.652915 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:13.653206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:14.152876 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:14.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:14.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:14.153340 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:14.653224 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:14.653299 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:14.653566 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:15.153381 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:15.153458 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:15.153856 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:15.653715 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:15.653796 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:15.654137 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:16.153469 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:16.153543 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:16.153826 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:16.153868 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:16.419404 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:16.476671 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:16.480081 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:16.480119 2059048 retry.go:31] will retry after 7.9937579s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:16.653575 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:16.653716 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:16.653985 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:17.153751 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:17.153850 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:17.154134 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:17.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:17.652939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:17.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:18.152893 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:18.152976 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:18.153301 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:18.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:18.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:18.653233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:18.653289 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:19.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:19.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:19.153227 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:19.652934 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:19.653010 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:19.653354 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:19.875781 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:19.950537 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:19.954067 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:19.954097 2059048 retry.go:31] will retry after 12.496952157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:20.153579 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:20.153656 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:20.154027 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:20.652751 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:20.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:20.653178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:21.153037 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:21.153112 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:21.153446 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:21.153504 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:21.652852 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:21.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:21.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:22.152882 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:22.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:22.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:22.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:22.652925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:22.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:23.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:23.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:23.153240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:23.652818 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:23.652892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:23.653158 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:23.653200 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:24.152783 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:24.152857 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:24.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:24.474774 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:24.538538 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:24.538585 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:24.538605 2059048 retry.go:31] will retry after 14.635173495s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:24.653139 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:24.653215 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:24.653538 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:25.153284 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:25.153354 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:25.153661 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:25.653607 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:25.653689 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:25.653986 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:25.654040 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:26.152728 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:26.152857 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:26.153186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:26.652777 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:26.652852 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:26.653175 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:27.152839 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:27.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:27.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:27.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:27.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:27.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:28.152874 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:28.152956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:28.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:28.153286 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:28.652849 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:28.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:28.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:29.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:29.152930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:29.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:29.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:29.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:29.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:30.152853 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:30.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:30.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:30.153348 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:30.653022 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:30.653115 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:30.653416 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:31.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:31.152960 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:31.153275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:31.652985 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:31.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:31.653405 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:32.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:32.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:32.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:32.451758 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:06:32.506473 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:32.509966 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:32.509998 2059048 retry.go:31] will retry after 31.028140902s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:32.653234 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:32.653309 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:32.653632 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:32.653749 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:33.153497 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:33.153583 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:33.153949 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:33.652732 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:33.652832 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:33.653182 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:34.152891 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:34.152964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:34.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:34.653098 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:34.653173 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:34.653525 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:35.153363 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:35.153489 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:35.153845 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:35.153907 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:35.653568 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:35.653649 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:35.653928 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:36.153725 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:36.153800 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:36.154115 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:36.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:36.652933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:36.653274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:37.153419 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:37.153492 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:37.153866 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:37.153952 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:37.653726 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:37.653797 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:37.654143 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:38.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:38.152933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:38.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:38.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:38.652935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:38.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:39.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:39.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:39.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:39.174643 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:39.239291 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:39.239335 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:39.239354 2059048 retry.go:31] will retry after 15.420333699s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:39.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:39.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:39.653261 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:39.653316 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:40.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:40.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:40.153285 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:40.653056 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:40.653131 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:40.653494 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:41.153188 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:41.153263 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:41.153588 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:41.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:41.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:41.653248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:42.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:42.152964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:42.153379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:42.153461 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:42.652919 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:42.653000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:42.653314 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:43.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:43.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:43.153240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:43.652956 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:43.653027 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:43.653379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:44.152963 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:44.153044 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:44.153381 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:44.653201 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:44.653284 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:44.653550 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:44.653592 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:45.153794 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:45.153882 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:45.154325 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:45.653107 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:45.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:45.653497 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:46.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:46.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:46.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:46.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:46.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:46.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:47.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:47.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:47.153246 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:47.153293 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:47.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:47.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:47.653331 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:48.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:48.152904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:48.153254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:48.652976 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:48.653054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:48.653401 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:49.152928 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:49.153003 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:49.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:49.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:49.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:49.653266 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:49.653325 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:50.152835 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:50.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:50.153230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:50.653021 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:50.653097 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:50.653360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:51.152819 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:51.152892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:51.153216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:51.652938 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:51.653021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:51.653350 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:51.653404 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:52.152878 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:52.152997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:52.153340 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:52.653054 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:52.653126 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:52.653428 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:53.152809 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:53.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:53.153202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:53.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:53.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:53.653212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:54.152921 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:54.153000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:54.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:54.153361 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:54.653430 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:54.653504 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:54.653886 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:54.660097 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:06:54.724740 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:06:54.724806 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:54.724824 2059048 retry.go:31] will retry after 21.489743806s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:06:55.153047 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:55.153170 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:55.153542 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:55.653137 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:55.653210 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:55.653500 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:56.153216 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:56.153285 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:56.153620 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:56.153682 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:56.653420 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:56.653501 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:56.653832 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:57.153605 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:57.153702 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:57.154020 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:57.652746 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:57.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:57.653184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:58.152882 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:58.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:58.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:58.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:58.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:58.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:06:58.653262 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:06:59.152798 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:59.152874 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:59.153218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:06:59.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:06:59.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:06:59.653193 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:00.155125 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:00.155210 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:00.156183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:00.653343 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:00.653416 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:00.653737 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:00.653787 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:01.152970 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:01.153062 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:01.153389 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:01.652835 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:01.652908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:01.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:02.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:02.152952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:02.153330 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:02.652900 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:02.652986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:02.653368 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:03.152826 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:03.152908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:03.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:03.153306 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:03.538820 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:07:03.598261 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:03.602187 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:03.602221 2059048 retry.go:31] will retry after 27.693032791s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:03.653410 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:03.653486 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:03.653840 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:04.153298 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:04.153371 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:04.153670 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:04.653539 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:04.653618 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:04.653956 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:05.153749 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:05.153837 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:05.154160 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:05.154219 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:05.653149 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:05.653217 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:05.653546 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:06.153378 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:06.153468 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:06.153799 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:06.653420 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:06.653494 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:06.653803 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:07.153113 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:07.153187 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:07.153451 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:07.652897 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:07.652979 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:07.653292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:07.653351 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:08.152825 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:08.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:08.153274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:08.652859 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:08.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:08.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:09.153667 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:09.153756 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:09.154076 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:09.652824 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:09.652899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:09.653232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:10.153341 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:10.153410 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:10.153757 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:10.153818 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:10.653710 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:10.653802 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:10.654164 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:11.152777 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:11.152862 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:11.153199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:11.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:11.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:11.653219 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:12.152831 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:12.152908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:12.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:12.652832 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:12.652911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:12.653226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:12.653273 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:13.152877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:13.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:13.153279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:13.652822 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:13.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:13.653241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:14.152814 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:14.152897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:14.153218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:14.653177 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:14.653250 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:14.653558 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:14.653611 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:15.153356 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:15.153436 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:15.153788 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:15.652725 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:15.652816 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:15.653161 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:16.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:16.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:16.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:16.215537 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:07:16.273841 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:16.273881 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:16.273899 2059048 retry.go:31] will retry after 30.872906877s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1219 06:07:16.653514 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:16.653598 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:16.653919 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:16.653970 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:17.153579 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:17.153656 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:17.153994 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:17.653597 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:17.653665 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:17.653945 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:18.152782 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:18.152859 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:18.153155 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:18.652836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:18.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:18.653269 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:19.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:19.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:19.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:19.153292 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:19.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:19.652916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:19.653250 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:20.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:20.152910 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:20.153258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:20.653266 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:20.653354 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:20.653711 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:21.153511 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:21.153585 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:21.153886 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:21.153948 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:21.653690 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:21.653776 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:21.654081 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:22.153312 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:22.153387 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:22.153749 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:22.653581 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:22.653661 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:22.654117 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:23.152715 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:23.152802 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:23.153141 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:23.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:23.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:23.653196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:23.653241 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:24.152848 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:24.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:24.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:24.653127 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:24.653203 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:24.653560 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:25.153321 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:25.153393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:25.153662 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:25.652744 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:25.652855 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:25.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:25.653298 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:26.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:26.153049 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:26.153397 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:26.652880 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:26.652963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:26.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:27.152811 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:27.152888 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:27.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:27.652936 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:27.653013 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:27.653346 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:27.653407 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:28.152886 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:28.152961 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:28.153298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:28.652829 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:28.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:28.653240 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:29.152818 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:29.152890 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:29.153229 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:29.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:29.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:29.653200 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:30.152832 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:30.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:30.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:30.153321 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:30.652996 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:30.653069 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:30.653387 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:31.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:31.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:31.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:31.295743 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1219 06:07:31.365905 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:31.365953 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:31.366070 2059048 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1219 06:07:31.653338 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:31.653413 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:31.653757 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:32.153415 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:32.153519 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:32.153862 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:32.153934 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:32.653181 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:32.653249 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:32.653512 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:33.152815 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:33.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:33.153193 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:33.652817 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:33.652892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:33.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:34.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:34.152941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:34.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:34.653155 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:34.653231 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:34.653574 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:34.653631 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:35.153386 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:35.153461 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:35.153800 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:35.652767 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:35.652837 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:35.653104 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:36.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:36.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:36.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:36.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:36.652927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:36.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:37.152893 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:37.152978 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:37.153238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:37.153278 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:37.652927 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:37.653002 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:37.653295 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:38.152985 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:38.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:38.153404 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:38.652889 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:38.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:38.653220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:39.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:39.152920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:39.153233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:39.153294 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:39.652812 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:39.652889 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:39.653215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:40.152857 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:40.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:40.153187 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:40.653073 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:40.653148 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:40.653479 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:41.152804 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:41.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:41.153180 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:41.652771 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:41.652841 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:41.653154 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:41.653206 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:42.152884 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:42.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:42.153327 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:42.652853 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:42.652951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:42.653271 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:43.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:43.152931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:43.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:43.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:43.652918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:43.653242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:43.653314 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:44.152994 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:44.153073 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:44.153402 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:44.653413 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:44.653502 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:44.653799 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:45.153668 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:45.153801 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:45.154199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:45.653080 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:45.653158 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:45.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:45.653538 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:46.153253 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:46.153372 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:46.153620 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:46.653410 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:46.653505 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:46.653901 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:47.147624 2059048 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1219 06:07:47.153238 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:47.153313 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:47.153618 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:47.207245 2059048 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:47.207289 2059048 addons.go:479] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1219 06:07:47.207381 2059048 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1219 06:07:47.212201 2059048 out.go:179] * Enabled addons: 
	I1219 06:07:47.215092 2059048 addons.go:546] duration metric: took 1m52.553895373s for enable addons: enabled=[]
	I1219 06:07:47.652840 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:47.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:47.653177 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:48.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:48.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:48.153274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:48.153336 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:48.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:48.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:48.653266 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:49.152877 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:49.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:49.153222 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:49.652844 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:49.652940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:49.653312 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:50.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:50.152906 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:50.153448 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:50.153518 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:50.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:50.653101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:50.653362 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:51.153113 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:51.153194 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:51.153608 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:51.653414 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:51.653487 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:51.653829 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:52.153249 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:52.153337 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:52.153602 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:52.153645 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:52.653349 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:52.653422 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:52.653735 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:53.153542 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:53.153620 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:53.153960 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:53.653712 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:53.653793 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:53.654088 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:54.153153 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:54.153246 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:54.153650 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:54.153714 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:54.653597 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:54.653675 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:54.654059 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:55.153390 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:55.153470 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:55.153780 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:55.652892 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:55.652968 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:55.653343 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:56.153064 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:56.153144 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:56.153504 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:56.652934 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:56.653001 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:56.653305 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:56.653374 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:57.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:57.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:57.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:57.652979 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:57.653054 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:57.653394 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:58.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:58.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:58.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:58.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:58.652916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:58.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:07:59.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:59.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:59.153252 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:07:59.153305 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:07:59.652871 2059048 type.go:165] "Request Body" body=""
	I1219 06:07:59.652957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:07:59.653221 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:00.152926 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:00.153011 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:00.153341 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:00.653583 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:00.653664 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:00.654050 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:01.153430 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:01.153505 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:01.153842 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:01.153899 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:01.653658 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:01.653734 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:01.654077 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:02.152810 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:02.152894 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:02.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:02.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:02.652992 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:02.653255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:03.152814 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:03.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:03.153241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:03.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:03.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:03.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:03.653372 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:04.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:04.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:04.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:04.653311 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:04.653393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:04.653691 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:05.153373 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:05.153449 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:05.153786 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:05.653505 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:05.653577 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:05.653866 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:05.653911 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:06.153684 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:06.153763 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:06.154116 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:06.652849 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:06.652928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:06.653265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:07.152801 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:07.152875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:07.153140 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:07.652821 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:07.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:07.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:08.152989 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:08.153069 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:08.153365 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:08.153412 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:08.652920 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:08.652987 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:08.653255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:09.152797 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:09.152875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:09.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:09.652928 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:09.653026 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:09.653367 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:10.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:10.152928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:10.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:10.653055 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:10.653153 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:10.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:10.653536 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:11.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:11.152945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:11.153256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:11.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:11.652952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:11.653279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:12.152810 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:12.152885 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:12.153241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:12.652960 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:12.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:12.653342 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:13.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:13.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:13.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:13.153250 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:13.652804 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:13.652895 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:13.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:14.152963 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:14.153048 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:14.153407 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:14.653382 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:14.653457 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:14.653810 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:15.153570 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:15.153650 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:15.153993 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:15.154055 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:15.652797 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:15.652875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:15.653205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:16.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:16.152927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:16.153181 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:16.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:16.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:16.653237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:17.152851 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:17.152930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:17.153255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:17.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:17.652952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:17.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:17.653327 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:18.152822 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:18.152897 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:18.153211 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:18.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:18.652929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:18.653226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:19.152862 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:19.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:19.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:19.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:19.652943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:19.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:20.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:20.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:20.153353 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:20.153402 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:20.652914 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:20.652990 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:20.653265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:21.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:21.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:21.153199 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:21.652903 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:21.652980 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:21.653286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:22.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:22.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:22.153212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:22.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:22.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:22.653257 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:22.653311 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:23.152987 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:23.153082 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:23.153450 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:23.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:23.652933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:23.653264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:24.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:24.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:24.153259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:24.653188 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:24.653259 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:24.653621 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:24.653676 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:25.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:25.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:25.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:25.653096 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:25.653176 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:25.653514 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:26.153303 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:26.153380 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:26.153718 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:26.653504 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:26.653583 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:26.653866 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:26.653917 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:27.153641 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:27.153723 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:27.154070 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:27.653768 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:27.653851 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:27.654198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:28.152880 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:28.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:28.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:28.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:28.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:28.653286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:29.152996 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:29.153076 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:29.153423 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:29.153485 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:29.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:29.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:29.653208 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:30.152848 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:30.152931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:30.153247 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:30.653099 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:30.653178 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:30.653543 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:31.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:31.152933 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:31.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:31.652836 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:31.652916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:31.653254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:31.653310 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:32.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:32.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:32.153234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:32.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:32.652974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:32.653291 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:33.152813 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:33.152892 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:33.153182 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:33.652873 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:33.652952 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:33.653279 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:33.653339 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:34.152888 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:34.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:34.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:34.653221 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:34.653303 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:34.653662 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:35.153491 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:35.153567 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:35.153923 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:35.653686 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:35.653756 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:35.654034 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:35.654075 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:36.152742 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:36.152852 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:36.153178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:36.652917 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:36.652991 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:36.653328 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:37.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:37.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:37.153269 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:37.652832 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:37.652905 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:37.653225 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:38.152823 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:38.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:38.153256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:38.153311 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:38.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:38.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:38.653254 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:39.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:39.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:39.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:39.652865 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:39.652948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:39.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:40.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:40.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:40.153201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:40.653085 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:40.653160 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:40.653488 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:40.653543 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:41.152787 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:41.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:41.153181 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:41.652770 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:41.652846 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:41.653122 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:42.152887 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:42.152974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:42.153376 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:42.653103 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:42.653188 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:42.653511 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:42.653570 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:43.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:43.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:43.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:43.652856 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:43.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:43.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:44.153027 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:44.153105 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:44.153433 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:44.653459 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:44.653530 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:44.653788 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:44.653840 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:45.153678 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:45.153766 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:45.156105 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=2
	I1219 06:08:45.653114 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:45.653196 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:45.653533 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:46.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:46.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:46.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:46.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:46.652950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:46.653258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:47.152995 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:47.153106 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:47.153459 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:47.153515 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:47.652878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:47.652955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:47.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:48.152852 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:48.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:48.153282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:48.652874 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:48.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:48.653318 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:49.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:49.152946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:49.153202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:49.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:49.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:49.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:49.653282 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:50.152954 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:50.153027 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:50.153317 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:50.653024 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:50.653102 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:50.653365 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:51.152850 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:51.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:51.153219 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:51.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:51.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:51.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:51.653343 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:52.152878 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:52.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:52.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:52.652974 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:52.653059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:52.653395 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:53.153096 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:53.153174 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:53.153508 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:53.652872 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:53.652951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:53.653281 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:54.152839 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:54.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:54.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:54.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:54.653199 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:54.653273 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:54.653604 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:55.153420 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:55.153510 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:55.153789 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:55.652811 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:55.652903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:55.653294 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:56.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:56.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:56.153223 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:56.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:56.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:56.653259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:56.653316 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:57.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:57.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:57.153205 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:57.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:57.652981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:57.653323 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:58.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:58.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:58.153209 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:58.652840 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:58.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:58.653217 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:08:59.152905 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:59.152981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:59.153315 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:08:59.153381 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:08:59.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:08:59.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:08:59.653183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:00.152906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:00.153005 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:00.153357 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:00.653205 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:00.653291 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:00.653625 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:01.153283 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:01.153360 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:01.153628 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:01.153671 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:01.653416 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:01.653497 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:01.653884 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:02.153557 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:02.153633 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:02.154010 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:02.652736 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:02.652830 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:02.653106 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:03.152802 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:03.152877 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:03.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:03.652921 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:03.652999 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:03.653309 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:03.653358 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:04.152885 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:04.152955 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:04.153320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:04.653344 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:04.653416 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:04.653746 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:05.153560 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:05.153640 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:05.153974 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:05.652768 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:05.652867 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:05.653179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:06.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:06.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:06.153224 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:06.153272 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:06.652921 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:06.653000 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:06.653306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:07.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:07.152957 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:07.153227 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:07.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:07.652898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:07.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:08.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:08.152914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:08.153262 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:08.153318 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:08.652911 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:08.652979 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:08.653282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:09.152916 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:09.152986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:09.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:09.652875 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:09.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:09.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:10.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:10.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:10.153226 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:10.653112 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:10.653192 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:10.653511 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:10.653577 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:11.153350 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:11.153429 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:11.153777 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:11.653085 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:11.653162 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:11.653429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:12.152806 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:12.152904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:12.153213 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:12.652905 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:12.652980 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:12.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:13.152912 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:13.152981 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:13.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:13.153367 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:13.653051 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:13.653127 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:13.653415 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:14.152821 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:14.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:14.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:14.653028 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:14.653106 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:14.653360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:15.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:15.152912 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:15.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:15.653006 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:15.653081 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:15.653415 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:15.653467 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:16.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:16.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:16.153234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:16.652830 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:16.652908 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:16.653204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:17.152901 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:17.152974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:17.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:17.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:17.652923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:17.653180 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:18.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:18.152893 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:18.153221 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:18.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:18.652906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:18.652988 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:18.653283 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:19.152887 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:19.152961 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:19.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:19.652913 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:19.652992 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:19.653321 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:20.153018 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:20.153092 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:20.153437 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:20.153501 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:20.653011 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:20.653093 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:20.653372 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:21.153069 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:21.153153 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:21.153484 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:21.652850 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:21.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:21.653322 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:22.153458 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:22.153530 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:22.153790 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:22.153833 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:22.653650 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:22.653724 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:22.654057 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:23.152779 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:23.152853 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:23.153175 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:23.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:23.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:23.653280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:24.152829 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:24.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:24.153276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:24.653129 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:24.653203 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:24.653539 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:24.653595 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:25.153181 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:25.153256 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:25.153525 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:25.653492 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:25.653572 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:25.653896 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:26.153709 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:26.153785 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:26.154149 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:26.653439 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:26.653511 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:26.653845 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:26.653912 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:27.153641 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:27.153711 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:27.154059 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:27.653737 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:27.653813 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:27.654171 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:28.152737 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:28.152824 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:28.153136 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:28.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:28.652929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:28.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:29.152816 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:29.152890 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:29.153304 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:29.153363 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:29.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:29.652949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:29.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:30.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:30.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:30.153286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:30.653142 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:30.653226 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:30.653576 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:31.152865 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:31.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:31.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:31.652853 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:31.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:31.653285 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:31.653341 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:32.153002 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:32.153077 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:32.153389 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:32.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:32.652928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:32.653191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:33.152906 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:33.152985 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:33.153320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:33.653030 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:33.653112 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:33.653463 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:33.653523 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:34.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:34.152936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:34.153191 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:34.653266 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:34.653343 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:34.653688 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:35.153480 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:35.153562 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:35.153920 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:35.653700 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:35.653779 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:35.654078 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:35.654124 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:36.152813 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:36.152899 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:36.153244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:36.652824 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:36.652902 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:36.653244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:37.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:37.152947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:37.153200 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:37.652845 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:37.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:37.653218 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:38.152831 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:38.152912 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:38.153208 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:38.153253 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:38.652887 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:38.652966 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:38.653228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:39.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:39.152913 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:39.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:39.652866 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:39.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:39.653299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:40.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:40.152934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:40.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:40.153287 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:40.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:40.653107 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:40.653447 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:41.152920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:41.153249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:41.652882 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:41.652956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:41.653222 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:42.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:42.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:42.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:42.153350 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:42.653039 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:42.653114 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:42.653443 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:43.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:43.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:43.153196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:43.652842 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:43.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:43.653298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:44.153021 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:44.153098 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:44.153446 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:44.153502 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:44.653394 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:44.653463 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:44.653758 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:45.153716 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:45.153844 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:45.154316 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:45.653443 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:45.653522 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:45.653863 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:46.153623 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:46.153701 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:46.153971 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:46.154014 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:46.653765 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:46.653843 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:46.654187 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:47.152841 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:47.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:47.153244 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:47.652861 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:47.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:47.653190 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:48.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:48.152954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:48.153355 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:48.653077 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:48.653151 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:48.653475 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:48.653535 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:49.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:49.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:49.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:49.652829 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:49.652905 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:49.653255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:50.152968 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:50.153052 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:50.153380 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:50.653140 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:50.653211 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:50.653679 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:50.653731 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:51.153473 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:51.153550 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:51.154738 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=1
	I1219 06:09:51.653546 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:51.653618 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:51.653958 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:52.153274 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:52.153349 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:52.153606 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:52.653351 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:52.653426 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:52.653752 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:52.653808 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:53.153430 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:53.153501 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:53.153810 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:53.653040 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:53.653137 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:53.653483 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:54.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:54.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:54.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:54.652950 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:54.653032 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:54.653335 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:55.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:55.152958 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:55.153273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:55.153315 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:55.653562 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:55.653634 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:55.653988 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:56.152721 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:56.152824 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:56.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:56.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:56.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:56.653204 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:57.152895 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:57.152971 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:57.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:57.153359 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:57.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:57.652904 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:57.653235 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:58.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:58.152942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:58.153268 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:58.652975 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:58.653058 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:58.653396 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:09:59.153105 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:59.153184 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:59.153474 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:09:59.153520 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:09:59.652864 2059048 type.go:165] "Request Body" body=""
	I1219 06:09:59.652944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:09:59.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:00.152931 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:00.153036 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:00.153356 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:00.653257 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:00.653334 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:00.653658 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:01.153453 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:01.153528 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:01.153794 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:01.153845 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:01.653619 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:01.653697 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:01.654088 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:02.153731 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:02.153810 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:02.154155 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:02.652784 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:02.652868 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:02.653195 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:03.152838 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:03.152917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:03.153278 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:03.652980 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:03.653056 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:03.653404 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:03.653465 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:04.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:04.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:04.153292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:04.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:04.653267 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:04.653580 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:05.152837 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:05.152916 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:05.153255 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:05.652984 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:05.653057 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:05.653348 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:06.153037 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:06.153117 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:06.153467 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:06.153522 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:06.653186 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:06.653261 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:06.653599 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:07.152860 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:07.152929 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:07.153189 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:07.652828 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:07.652907 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:07.653267 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:08.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:08.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:08.153282 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:08.652877 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:08.652946 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:08.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:08.653242 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:09.152822 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:09.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:09.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:09.652859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:09.652947 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:09.653294 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:10.152871 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:10.152950 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:10.153225 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:10.653128 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:10.653226 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:10.653575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:10.653636 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:11.153427 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:11.153513 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:11.153854 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:11.653325 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:11.653393 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:11.653695 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:12.153493 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:12.153567 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:12.153867 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:12.653672 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:12.653754 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:12.654079 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:12.654129 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:13.152786 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:13.152870 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:13.153265 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:13.652847 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:13.652923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:13.653234 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:14.152794 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:14.152866 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:14.153184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:14.653147 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:14.653224 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:14.653478 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:15.152818 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:15.152898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:15.153253 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:15.153301 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:15.653095 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:15.653176 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:15.653536 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:16.152872 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:16.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:16.153210 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:16.652894 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:16.652982 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:16.653351 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:17.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:17.152963 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:17.153312 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:17.153367 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:17.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:17.652943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:17.653235 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:18.152825 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:18.152903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:18.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:18.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:18.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:18.653272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:19.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:19.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:19.153179 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:19.652851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:19.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:19.653284 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:19.653346 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:20.152854 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:20.152932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:20.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:20.653045 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:20.653125 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:20.653445 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:21.153150 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:21.153227 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:21.153575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:21.652847 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:21.652931 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:21.653260 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:22.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:22.152940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:22.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:22.153260 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:22.652895 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:22.652974 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:22.653308 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:23.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:23.152911 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:23.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:23.653479 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:23.653551 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:23.653818 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:24.153710 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:24.153800 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:24.154142 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:24.154201 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:24.653230 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:24.653310 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:24.653643 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:25.153415 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:25.153494 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:25.153825 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:25.652780 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:25.652869 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:25.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:26.152955 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:26.153029 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:26.153332 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:26.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:26.652926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:26.653203 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:26.653244 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:27.152819 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:27.152893 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:27.153237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:27.652953 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:27.653040 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:27.653394 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:28.152864 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:28.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:28.153198 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:28.652846 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:28.652921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:28.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:28.653296 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:29.153007 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:29.153109 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:29.153490 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:29.652904 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:29.653002 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:29.653393 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:30.152859 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:30.152941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:30.153290 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:30.653036 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:30.653110 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:30.653469 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:30.653528 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:31.152866 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:31.152940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:31.153201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:31.652841 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:31.652917 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:31.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:32.152981 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:32.153059 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:32.153421 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:32.652862 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:32.652934 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:32.653184 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:33.152815 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:33.152902 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:33.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:33.153256 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:33.652858 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:33.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:33.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:34.153699 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:34.153778 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:34.154156 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:34.652927 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:34.653004 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:34.653344 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:35.152944 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:35.153053 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:35.153407 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:35.153473 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:35.653031 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:35.653108 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:35.653439 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:36.152985 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:36.153058 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:36.153410 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:36.652997 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:36.653074 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:36.653385 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:37.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:37.152927 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:37.153178 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:37.652820 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:37.652898 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:37.653238 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:37.653293 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:38.152835 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:38.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:38.153307 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:38.652893 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:38.652984 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:38.653356 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:39.152830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:39.152909 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:39.153250 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:39.652830 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:39.652914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:39.653251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:40.152861 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:40.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:40.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:40.153267 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:40.653044 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:40.653127 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:40.653472 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:41.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:41.153236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:41.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:41.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:41.653202 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:42.152888 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:42.152978 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:42.153360 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:42.153425 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:42.653108 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:42.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:42.653535 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:43.153293 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:43.153377 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:43.153699 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:43.653511 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:43.653596 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:43.653946 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:44.153626 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:44.153701 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:44.154058 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:44.154116 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:44.653518 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:44.653586 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:44.653839 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:45.153714 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:45.153803 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:45.154242 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:45.653103 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:45.653182 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:45.653567 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:46.152856 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:46.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:46.153239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:46.652826 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:46.652901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:46.653220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:46.653276 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:47.153000 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:47.153090 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:47.153484 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:47.652874 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:47.652945 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:47.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:48.152829 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:48.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:48.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:48.653001 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:48.653085 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:48.653425 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:48.653480 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:49.152866 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:49.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:49.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:49.652909 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:49.652997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:49.653352 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:50.152932 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:50.153010 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:50.153347 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:50.653023 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:50.653100 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:50.653383 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:51.153061 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:51.153137 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:51.153452 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:51.153512 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:51.653210 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:51.653296 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:51.653657 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:52.153489 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:52.153557 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:52.153876 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:52.653692 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:52.653768 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:52.654090 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:53.152787 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:53.152864 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:53.153157 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:53.652784 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:53.652862 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:53.653125 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:53.653167 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:54.152851 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:54.152939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:54.153283 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:54.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:54.653265 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:54.653642 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:55.153555 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:55.153638 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:55.153984 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:55.652996 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:55.653078 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:55.653399 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:55.653461 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:56.153144 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:56.153229 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:56.153575 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:56.653124 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:56.653197 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:56.653482 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:57.152912 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:57.152988 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:57.153306 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:57.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:57.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:57.653287 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:58.152883 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:58.152951 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:58.153228 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:10:58.153273 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:10:58.652822 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:58.652903 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:58.653236 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:59.152792 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:59.152869 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:59.153185 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:10:59.652732 2059048 type.go:165] "Request Body" body=""
	I1219 06:10:59.652827 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:10:59.653140 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:00.152931 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:00.153022 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:00.153339 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:00.153391 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:00.653248 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:00.653323 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:00.653669 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:01.153449 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:01.153521 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:01.153868 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:01.653237 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:01.653336 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:01.653684 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:02.153489 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:02.153575 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:02.153909 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:02.153978 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:02.653742 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:02.653822 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:02.654154 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:03.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:03.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:03.153263 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:03.652860 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:03.652939 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:03.653311 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:04.152862 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:04.152937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:04.153190 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:04.653274 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:04.653353 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:04.653687 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:04.653753 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:05.153511 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:05.153585 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:05.153947 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:05.652710 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:05.652796 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:05.653061 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:06.152795 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:06.152871 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:06.153232 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:06.652955 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:06.653039 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:06.653379 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:07.152874 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:07.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:07.153220 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:07.153260 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:07.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:07.652932 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:07.653275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:08.152847 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:08.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:08.153280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:08.652915 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:08.652997 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:08.653258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:09.152863 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:09.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:09.153313 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:09.153369 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:09.653051 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:09.653130 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:09.653466 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:10.152899 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:10.152977 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:10.153296 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:10.653043 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:10.653165 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:10.653488 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:11.152868 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:11.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:11.153273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:11.653413 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:11.653483 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:11.653817 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:11.653864 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:12.153612 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:12.153686 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:12.154026 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:12.652742 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:12.652850 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:12.653128 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:13.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:13.152886 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:13.153131 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:13.652843 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:13.652924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:13.653273 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:14.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:14.152944 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:14.153275 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:14.153331 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:14.653226 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:14.653309 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:14.653648 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:15.153413 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:15.153488 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:15.153804 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:15.652744 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:15.652844 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:15.653142 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:16.152784 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:16.152853 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:16.153159 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:16.652816 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:16.652891 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:16.653186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:16.653233 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:17.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:17.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:17.153251 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:17.652870 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:17.652936 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:17.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:18.152833 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:18.152914 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:18.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:18.652945 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:18.653025 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:18.653318 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:18.653380 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:19.152875 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:19.152949 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:19.153231 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:19.652844 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:19.652920 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:19.653256 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:20.152860 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:20.152935 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:20.153299 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:20.653014 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:20.653092 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:20.653351 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:21.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:21.152918 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:21.153277 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:21.153341 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:21.653010 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:21.653087 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:21.653481 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:22.152850 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:22.152924 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:22.153186 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:22.652828 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:22.652906 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:22.653231 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:23.152837 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:23.152923 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:23.153272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:23.652867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:23.652942 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:23.653201 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:23.653241 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:24.152900 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:24.152984 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:24.153313 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:24.653189 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:24.653270 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:24.653611 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:25.152919 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:25.152989 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:25.153298 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:25.653028 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:25.653101 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:25.653438 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:25.653492 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:26.153176 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:26.153256 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:26.153570 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:26.652912 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:26.652986 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:26.653378 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:27.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:27.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:27.153259 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:27.652839 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:27.652922 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:27.653245 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:28.152867 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:28.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:28.153233 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:28.153276 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:28.652833 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:28.652907 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:28.653249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:29.152989 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:29.153064 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:29.153396 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:29.652869 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:29.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:29.653239 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:30.152845 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:30.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:30.153258 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:30.153317 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:30.653045 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:30.653118 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:30.653429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:31.152869 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:31.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:31.153207 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:31.652855 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:31.652930 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:31.653292 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:32.152836 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:32.152919 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:32.153241 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:32.652864 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:32.652940 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:32.653216 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:32.653263 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:33.152791 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:33.152875 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:33.153206 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:33.652896 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:33.652977 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:33.653320 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:34.152876 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:34.152959 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:34.153289 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:34.653352 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:34.653430 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:34.653807 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:34.653869 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:35.153637 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:35.153718 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:35.154044 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:35.652946 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:35.653021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:35.653340 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:36.152965 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:36.153049 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:36.153384 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:36.653133 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:36.653213 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:36.653535 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:37.153283 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:37.153355 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:37.153669 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:37.153731 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:37.653516 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:37.653597 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:37.653938 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:38.153755 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:38.153833 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:38.154248 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:38.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:38.652956 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:38.653272 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:39.152844 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:39.152925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:39.153296 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:39.652881 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:39.652964 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:39.653289 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:39.653347 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:40.152873 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:40.152948 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:40.153212 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:40.653026 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:40.653107 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:40.653447 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:41.152846 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:41.152921 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:41.153249 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:41.652949 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:41.653030 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:41.653297 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:42.152907 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:42.153021 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:42.153435 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:42.153505 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:42.653182 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:42.653258 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:42.653594 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:43.152852 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:43.152926 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:43.153183 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:43.652858 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:43.652938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:43.653276 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:44.152855 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:44.152938 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:44.153264 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:44.653236 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:44.653312 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:44.653574 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:44.653614 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:45.153508 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:45.153630 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:45.154114 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:45.653037 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:45.653120 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:45.653478 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:46.152854 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:46.152928 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:46.153280 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:46.652857 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:46.652941 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:46.653286 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:47.152995 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:47.153070 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:47.153369 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:47.153415 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:47.652884 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:47.652954 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:47.653305 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:48.152817 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:48.152895 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:48.153237 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:48.652848 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:48.652925 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:48.653243 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:49.152858 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:49.152943 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:49.153213 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:49.652807 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:49.652884 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:49.653230 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:49.653285 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:50.152972 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:50.153050 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:50.153344 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:50.653187 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:50.653264 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:50.653522 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:51.153213 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:51.153287 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:51.153583 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:51.653360 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:51.653435 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:51.653779 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:51.653840 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:52.153070 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:52.153143 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:52.153403 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:52.653101 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:52.653190 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:52.653483 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:53.152800 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:53.152878 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:53.153196 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:53.652868 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:53.652937 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:53.653274 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:54.152820 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:54.152901 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:54.153215 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1219 06:11:54.153274 2059048 node_ready.go:55] error getting node "functional-006924" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-006924": dial tcp 192.168.49.2:8441: connect: connection refused
	I1219 06:11:54.653291 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:54.653363 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:54.653706 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:55.152998 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:55.153089 2059048 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-006924" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1219 06:11:55.153429 2059048 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1219 06:11:55.653064 2059048 type.go:165] "Request Body" body=""
	I1219 06:11:55.653125 2059048 node_ready.go:38] duration metric: took 6m0.000540604s for node "functional-006924" to be "Ready" ...
	I1219 06:11:55.656290 2059048 out.go:203] 
	W1219 06:11:55.659114 2059048 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1219 06:11:55.659135 2059048 out.go:285] * 
	W1219 06:11:55.661307 2059048 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1219 06:11:55.664349 2059048 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 19 06:12:03 functional-006924 containerd[5249]: time="2025-12-19T06:12:03.448591666Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:04 functional-006924 containerd[5249]: time="2025-12-19T06:12:04.498334228Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 19 06:12:04 functional-006924 containerd[5249]: time="2025-12-19T06:12:04.500470113Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 19 06:12:04 functional-006924 containerd[5249]: time="2025-12-19T06:12:04.508749787Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:04 functional-006924 containerd[5249]: time="2025-12-19T06:12:04.509292299Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:05 functional-006924 containerd[5249]: time="2025-12-19T06:12:05.455765557Z" level=info msg="No images store for sha256:77bd2e9ec09b9f03e181ef448174ba62f2bf72888843372bb729abc0e9bb591d"
	Dec 19 06:12:05 functional-006924 containerd[5249]: time="2025-12-19T06:12:05.457957869Z" level=info msg="ImageCreate event name:\"docker.io/library/minikube-local-cache-test:functional-006924\""
	Dec 19 06:12:05 functional-006924 containerd[5249]: time="2025-12-19T06:12:05.465100463Z" level=info msg="ImageCreate event name:\"sha256:51d14939a1995a88415fccb269ec40dc043aefbcf5035f79ba02097bb3909863\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:05 functional-006924 containerd[5249]: time="2025-12-19T06:12:05.465568210Z" level=info msg="ImageUpdate event name:\"docker.io/library/minikube-local-cache-test:functional-006924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:06 functional-006924 containerd[5249]: time="2025-12-19T06:12:06.284212603Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\""
	Dec 19 06:12:06 functional-006924 containerd[5249]: time="2025-12-19T06:12:06.286648914Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:latest\""
	Dec 19 06:12:06 functional-006924 containerd[5249]: time="2025-12-19T06:12:06.288918863Z" level=info msg="ImageDelete event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\""
	Dec 19 06:12:06 functional-006924 containerd[5249]: time="2025-12-19T06:12:06.302188402Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\" returns successfully"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.264422448Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\""
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.266842808Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:3.1\""
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.269981025Z" level=info msg="ImageDelete event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\""
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.276420564Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\" returns successfully"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.444119016Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.447262763Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.454411610Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.454763778Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.573995122Z" level=info msg="No images store for sha256:3ac89611d5efd8eb74174b1f04c33b7e73b651cec35b5498caf0cfdd2efd7d48"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.576369794Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.1\""
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.583465595Z" level=info msg="ImageCreate event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:12:07 functional-006924 containerd[5249]: time="2025-12-19T06:12:07.584066947Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:12:11.622646    9352 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:12:11.623099    9352 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:12:11.624929    9352 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:12:11.625394    9352 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:12:11.627086    9352 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 06:12:11 up 10:54,  0 user,  load average: 0.97, 0.46, 0.78
	Linux functional-006924 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 19 06:12:08 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:12:09 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 827.
	Dec 19 06:12:09 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:09 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:09 functional-006924 kubelet[9220]: E1219 06:12:09.457177    9220 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:12:09 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:12:09 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:12:10 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 828.
	Dec 19 06:12:10 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:10 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:10 functional-006924 kubelet[9237]: E1219 06:12:10.197914    9237 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:12:10 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:12:10 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:12:10 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 829.
	Dec 19 06:12:10 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:10 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:10 functional-006924 kubelet[9271]: E1219 06:12:10.974245    9271 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:12:10 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:12:10 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:12:11 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 830.
	Dec 19 06:12:11 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:11 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:12:11 functional-006924 kubelet[9359]: E1219 06:12:11.706144    9359 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:12:11 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:12:11 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

                                                
                                                
-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924: exit status 2 (391.414412ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-006924" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/MinikubeKubectlCmdDirectly (2.39s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig (735.64s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig
functional_test.go:772: (dbg) Run:  out/minikube-linux-arm64 start -p functional-006924 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E1219 06:15:27.488937 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:16:29.406748 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:17:52.454246 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:20:27.486050 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:21:29.405683 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:23:30.534253 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:772: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-006924 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: exit status 109 (12m13.349490383s)

-- stdout --
	* [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22230
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "functional-006924" primary control-plane node in "functional-006924" cluster
	* Pulling base image v0.0.48-1765966054-22186 ...
	* Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	  - apiserver.enable-admission-plugins=NamespaceAutoProvision

-- /stdout --
** stderr ** 
	! Unable to restart control-plane node(s), will reset cluster: <no value>
	! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000561406s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000209622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000209622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	* Related issue: https://github.com/kubernetes/minikube/issues/4172

** /stderr **
functional_test.go:774: failed to restart minikube. args "out/minikube-linux-arm64 start -p functional-006924 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all": exit status 109
functional_test.go:776: restart took 12m13.350738939s for "functional-006924" cluster.
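The failure above bottoms out in kubeadm's kubelet-check never getting a response from the kubelet health endpoint ("dial tcp 127.0.0.1:10248: connect: connection refused"). As a minimal triage sketch (assuming a node where `curl` is available; the follow-up commands are the ones the kubeadm output and the minikube suggestion themselves name), the same probe looks like:

```shell
# Probe the kubelet healthz endpoint the same way kubeadm's kubelet-check does.
# Triage sketch only; not part of the test suite.
HEALTHZ_URL="http://127.0.0.1:10248/healthz"

if curl -sSf --max-time 2 "$HEALTHZ_URL" >/dev/null 2>&1; then
    echo "kubelet: healthy"
else
    # Matches the "connection refused" in the log: the kubelet never came up.
    echo "kubelet: unreachable at ${HEALTHZ_URL}"
    # Next steps, per the kubeadm output and the minikube suggestion above:
    #   systemctl status kubelet
    #   journalctl -xeu kubelet
    #   minikube start --extra-config=kubelet.cgroup-driver=systemd
fi
```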
I1219 06:24:26.046286 2000386 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-006924
helpers_test.go:244: (dbg) docker inspect functional-006924:

-- stdout --
	[
	    {
	        "Id": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	        "Created": "2025-12-19T05:57:32.987616309Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2053574,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T05:57:33.050252475Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hostname",
	        "HostsPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hosts",
	        "LogPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6-json.log",
	        "Name": "/functional-006924",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-006924:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-006924",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	                "LowerDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/merged",
	                "UpperDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/diff",
	                "WorkDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-006924",
	                "Source": "/var/lib/docker/volumes/functional-006924/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-006924",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-006924",
	                "name.minikube.sigs.k8s.io": "functional-006924",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "c06ab2bd44169716d410789ed39ed6e7c04e20cbf7fddb96691439282b9c97ca",
	            "SandboxKey": "/var/run/docker/netns/c06ab2bd4416",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34704"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34705"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34708"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34706"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34707"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-006924": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "42:2f:87:6a:a8:7b",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "f63e8dc2cff83663f8a4d14108f192e61e457410fa4fc720cd9630dbf354815d",
	                    "EndpointID": "aa2b1cbd90d5c1f6130481423d97f82d974d4197e41ad0dbe3b7e51b22c8b4cc",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-006924",
	                        "651d0d6ef1db"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
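The inspect dump above buries the host-port mappings inside `NetworkSettings.Ports`. A Go-template query can extract a single mapping, e.g. the apiserver port `8441/tcp` (a sketch: it assumes a running docker daemon and the `functional-006924` container, so the command is only printed here, not executed; per the Ports block above it would resolve to 34707):

```shell
# Go template that indexes NetworkSettings.Ports for one container port.
# Hypothetical usage; 'docker' and the container are assumed, so only print the command.
FMT='{{(index (index .NetworkSettings.Ports "8441/tcp") 0).HostPort}}'
echo "docker inspect --format '${FMT}' functional-006924"
```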
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924: exit status 2 (306.687461ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                         ARGS                                                                          │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-125117 image ls                                                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format json --alsologtostderr                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format table --alsologtostderr                                                                                           │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ delete         │ -p functional-125117                                                                                                                                  │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ start          │ -p functional-006924 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │                     │
	│ start          │ -p functional-006924 --alsologtostderr -v=8                                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:05 UTC │                     │
	│ cache          │ functional-006924 cache add registry.k8s.io/pause:3.1                                                                                                 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache add registry.k8s.io/pause:3.3                                                                                                 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache add registry.k8s.io/pause:latest                                                                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache add minikube-local-cache-test:functional-006924                                                                               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache delete minikube-local-cache-test:functional-006924                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ delete registry.k8s.io/pause:3.3                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ list                                                                                                                                                  │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl images                                                                                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                    │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │                     │
	│ cache          │ functional-006924 cache reload                                                                                                                        │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ delete registry.k8s.io/pause:3.1                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ delete registry.k8s.io/pause:latest                                                                                                                   │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ kubectl        │ functional-006924 kubectl -- --context functional-006924 get pods                                                                                     │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │                     │
	│ start          │ -p functional-006924 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │                     │
	└────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 06:12:12
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 06:12:12.743158 2064791 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:12:12.743269 2064791 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:12:12.743273 2064791 out.go:374] Setting ErrFile to fd 2...
	I1219 06:12:12.743277 2064791 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:12:12.743528 2064791 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:12:12.743902 2064791 out.go:368] Setting JSON to false
	I1219 06:12:12.744837 2064791 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":39279,"bootTime":1766085454,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:12:12.744896 2064791 start.go:143] virtualization:  
	I1219 06:12:12.748217 2064791 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:12:12.751238 2064791 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:12:12.751295 2064791 notify.go:221] Checking for updates...
	I1219 06:12:12.757153 2064791 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:12:12.760103 2064791 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:12:12.763068 2064791 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:12:12.765948 2064791 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:12:12.768902 2064791 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:12:12.772437 2064791 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:12:12.772538 2064791 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:12:12.804424 2064791 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:12:12.804525 2064791 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:12:12.859954 2064791 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-19 06:12:12.850685523 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:12:12.860047 2064791 docker.go:319] overlay module found
	I1219 06:12:12.863098 2064791 out.go:179] * Using the docker driver based on existing profile
	I1219 06:12:12.866014 2064791 start.go:309] selected driver: docker
	I1219 06:12:12.866030 2064791 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:12:12.866122 2064791 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:12:12.866232 2064791 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:12:12.920329 2064791 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-19 06:12:12.911575892 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:12:12.920732 2064791 start_flags.go:993] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1219 06:12:12.920793 2064791 cni.go:84] Creating CNI manager for ""
	I1219 06:12:12.920848 2064791 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:12:12.920889 2064791 start.go:353] cluster config:
	{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:12:12.924076 2064791 out.go:179] * Starting "functional-006924" primary control-plane node in "functional-006924" cluster
	I1219 06:12:12.926767 2064791 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1219 06:12:12.929823 2064791 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1219 06:12:12.932605 2064791 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:12:12.932642 2064791 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1219 06:12:12.932650 2064791 cache.go:65] Caching tarball of preloaded images
	I1219 06:12:12.932677 2064791 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1219 06:12:12.932745 2064791 preload.go:238] Found /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1219 06:12:12.932796 2064791 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1219 06:12:12.932911 2064791 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/config.json ...
	I1219 06:12:12.951789 2064791 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1219 06:12:12.951800 2064791 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1219 06:12:12.951830 2064791 cache.go:243] Successfully downloaded all kic artifacts
	I1219 06:12:12.951863 2064791 start.go:360] acquireMachinesLock for functional-006924: {Name:mkc84f48e83d18024791d45db780f3ccd746613a Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1219 06:12:12.951927 2064791 start.go:364] duration metric: took 47.033µs to acquireMachinesLock for "functional-006924"
	I1219 06:12:12.951947 2064791 start.go:96] Skipping create...Using existing machine configuration
	I1219 06:12:12.951951 2064791 fix.go:54] fixHost starting: 
	I1219 06:12:12.952210 2064791 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:12:12.969279 2064791 fix.go:112] recreateIfNeeded on functional-006924: state=Running err=<nil>
	W1219 06:12:12.969299 2064791 fix.go:138] unexpected machine state, will restart: <nil>
	I1219 06:12:12.972432 2064791 out.go:252] * Updating the running docker "functional-006924" container ...
	I1219 06:12:12.972457 2064791 machine.go:94] provisionDockerMachine start ...
	I1219 06:12:12.972536 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:12.989705 2064791 main.go:144] libmachine: Using SSH client type: native
	I1219 06:12:12.990045 2064791 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:12:12.990052 2064791 main.go:144] libmachine: About to run SSH command:
	hostname
	I1219 06:12:13.144528 2064791 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:12:13.144543 2064791 ubuntu.go:182] provisioning hostname "functional-006924"
	I1219 06:12:13.144626 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:13.163735 2064791 main.go:144] libmachine: Using SSH client type: native
	I1219 06:12:13.164043 2064791 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:12:13.164057 2064791 main.go:144] libmachine: About to run SSH command:
	sudo hostname functional-006924 && echo "functional-006924" | sudo tee /etc/hostname
	I1219 06:12:13.331538 2064791 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:12:13.331610 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:13.350490 2064791 main.go:144] libmachine: Using SSH client type: native
	I1219 06:12:13.350800 2064791 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:12:13.350813 2064791 main.go:144] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-006924' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-006924/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-006924' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1219 06:12:13.509192 2064791 main.go:144] libmachine: SSH cmd err, output: <nil>: 
	I1219 06:12:13.509210 2064791 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22230-1998525/.minikube CaCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22230-1998525/.minikube}
	I1219 06:12:13.509245 2064791 ubuntu.go:190] setting up certificates
	I1219 06:12:13.509254 2064791 provision.go:84] configureAuth start
	I1219 06:12:13.509315 2064791 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:12:13.528067 2064791 provision.go:143] copyHostCerts
	I1219 06:12:13.528151 2064791 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem, removing ...
	I1219 06:12:13.528164 2064791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:12:13.528239 2064791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem (1671 bytes)
	I1219 06:12:13.528339 2064791 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem, removing ...
	I1219 06:12:13.528348 2064791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:12:13.528375 2064791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem (1078 bytes)
	I1219 06:12:13.528452 2064791 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem, removing ...
	I1219 06:12:13.528456 2064791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:12:13.528480 2064791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem (1123 bytes)
	I1219 06:12:13.528529 2064791 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem org=jenkins.functional-006924 san=[127.0.0.1 192.168.49.2 functional-006924 localhost minikube]
	I1219 06:12:13.839797 2064791 provision.go:177] copyRemoteCerts
	I1219 06:12:13.839849 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1219 06:12:13.839888 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:13.857134 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:13.968475 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1219 06:12:13.985747 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1219 06:12:14.005527 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1219 06:12:14.024925 2064791 provision.go:87] duration metric: took 515.64823ms to configureAuth
	I1219 06:12:14.024943 2064791 ubuntu.go:206] setting minikube options for container-runtime
	I1219 06:12:14.025140 2064791 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:12:14.025146 2064791 machine.go:97] duration metric: took 1.052684031s to provisionDockerMachine
	I1219 06:12:14.025152 2064791 start.go:293] postStartSetup for "functional-006924" (driver="docker")
	I1219 06:12:14.025162 2064791 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1219 06:12:14.025218 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1219 06:12:14.025263 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.043178 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.148605 2064791 ssh_runner.go:195] Run: cat /etc/os-release
	I1219 06:12:14.151719 2064791 main.go:144] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1219 06:12:14.151753 2064791 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1219 06:12:14.151766 2064791 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/addons for local assets ...
	I1219 06:12:14.151823 2064791 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/files for local assets ...
	I1219 06:12:14.151902 2064791 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> 20003862.pem in /etc/ssl/certs
	I1219 06:12:14.151975 2064791 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> hosts in /etc/test/nested/copy/2000386
	I1219 06:12:14.152026 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/2000386
	I1219 06:12:14.159336 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:12:14.177055 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts --> /etc/test/nested/copy/2000386/hosts (40 bytes)
	I1219 06:12:14.195053 2064791 start.go:296] duration metric: took 169.886807ms for postStartSetup
	I1219 06:12:14.195138 2064791 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:12:14.195175 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.212871 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.317767 2064791 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1219 06:12:14.322386 2064791 fix.go:56] duration metric: took 1.37042768s for fixHost
	I1219 06:12:14.322401 2064791 start.go:83] releasing machines lock for "functional-006924", held for 1.370467196s
	I1219 06:12:14.322474 2064791 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:12:14.339208 2064791 ssh_runner.go:195] Run: cat /version.json
	I1219 06:12:14.339250 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.339514 2064791 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1219 06:12:14.339574 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.363989 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.366009 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.468260 2064791 ssh_runner.go:195] Run: systemctl --version
	I1219 06:12:14.559810 2064791 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1219 06:12:14.563901 2064791 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1219 06:12:14.563968 2064791 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1219 06:12:14.571453 2064791 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1219 06:12:14.571466 2064791 start.go:496] detecting cgroup driver to use...
	I1219 06:12:14.571496 2064791 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1219 06:12:14.571541 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1219 06:12:14.588970 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1219 06:12:14.603919 2064791 docker.go:218] disabling cri-docker service (if available) ...
	I1219 06:12:14.603971 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1219 06:12:14.620412 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1219 06:12:14.634912 2064791 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1219 06:12:14.757018 2064791 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1219 06:12:14.879281 2064791 docker.go:234] disabling docker service ...
	I1219 06:12:14.879341 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1219 06:12:14.894279 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1219 06:12:14.907362 2064791 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1219 06:12:15.033676 2064791 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1219 06:12:15.155919 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1219 06:12:15.169590 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1219 06:12:15.184917 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1219 06:12:15.194691 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1219 06:12:15.203742 2064791 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1219 06:12:15.203801 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1219 06:12:15.212945 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:12:15.221903 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1219 06:12:15.231019 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:12:15.239988 2064791 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1219 06:12:15.248292 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1219 06:12:15.257554 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1219 06:12:15.266460 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
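	[editor's note] The sed commands above rewrite /etc/containerd/config.toml in place: they pin the pause image, force SystemdCgroup = false (matching the detected "cgroupfs" driver), and inject enable_unprivileged_ports under the CRI plugin table. A minimal Python sketch of the same three substitutions, applied to a hypothetical config fragment (not minikube's actual code):

```python
import re

# Hypothetical fragment of /etc/containerd/config.toml before the edits.
config = '''\
[plugins."io.containerd.grpc.v1.cri"]
  sandbox_image = "registry.k8s.io/pause:3.9"
  [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
    SystemdCgroup = true
'''

# Pin the sandbox (pause) image, preserving indentation via the capture group.
config = re.sub(r'(?m)^( *)sandbox_image = .*$',
                r'\1sandbox_image = "registry.k8s.io/pause:3.10.1"', config)
# Force the cgroupfs driver by disabling SystemdCgroup.
config = re.sub(r'(?m)^( *)SystemdCgroup = .*$',
                r'\1SystemdCgroup = false', config)
# Insert enable_unprivileged_ports right after the CRI plugin header,
# like the sed `&\n\1  enable_unprivileged_ports = true` replacement.
config = re.sub(r'(?m)^( *)\[plugins\."io\.containerd\.grpc\.v1\.cri"\]',
                lambda m: m.group(0) + '\n' + m.group(1)
                          + '  enable_unprivileged_ports = true',
                config)
print(config)
```

The indentation-preserving capture group matters because containerd's TOML nests tables by indentation, and a flat replacement would corrupt the file.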
	I1219 06:12:15.275351 2064791 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1219 06:12:15.282864 2064791 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1219 06:12:15.290662 2064791 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:12:15.400462 2064791 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1219 06:12:15.544853 2064791 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1219 06:12:15.544914 2064791 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1219 06:12:15.549076 2064791 start.go:564] Will wait 60s for crictl version
	I1219 06:12:15.549132 2064791 ssh_runner.go:195] Run: which crictl
	I1219 06:12:15.552855 2064791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1219 06:12:15.578380 2064791 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1219 06:12:15.578461 2064791 ssh_runner.go:195] Run: containerd --version
	I1219 06:12:15.600920 2064791 ssh_runner.go:195] Run: containerd --version
	I1219 06:12:15.626436 2064791 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1219 06:12:15.629308 2064791 cli_runner.go:164] Run: docker network inspect functional-006924 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 06:12:15.645624 2064791 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1219 06:12:15.652379 2064791 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1219 06:12:15.655147 2064791 kubeadm.go:884] updating cluster {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1219 06:12:15.655272 2064791 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:12:15.655368 2064791 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:12:15.679674 2064791 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:12:15.679686 2064791 containerd.go:534] Images already preloaded, skipping extraction
	I1219 06:12:15.679751 2064791 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:12:15.704545 2064791 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:12:15.704557 2064791 cache_images.go:86] Images are preloaded, skipping loading
	I1219 06:12:15.704563 2064791 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1219 06:12:15.704666 2064791 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-006924 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1219 06:12:15.704733 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s info
	I1219 06:12:15.729671 2064791 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1219 06:12:15.729690 2064791 cni.go:84] Creating CNI manager for ""
	I1219 06:12:15.729697 2064791 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:12:15.729711 2064791 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1219 06:12:15.729738 2064791 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-006924 NodeName:functional-006924 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1219 06:12:15.729853 2064791 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-006924"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1219 06:12:15.729919 2064791 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1219 06:12:15.737786 2064791 binaries.go:51] Found k8s binaries, skipping transfer
	I1219 06:12:15.737845 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1219 06:12:15.745456 2064791 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1219 06:12:15.758378 2064791 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1219 06:12:15.775454 2064791 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2085 bytes)
	I1219 06:12:15.788878 2064791 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1219 06:12:15.792954 2064791 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:12:15.901526 2064791 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:12:16.150661 2064791 certs.go:69] Setting up /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924 for IP: 192.168.49.2
	I1219 06:12:16.150673 2064791 certs.go:195] generating shared ca certs ...
	I1219 06:12:16.150687 2064791 certs.go:227] acquiring lock for ca certs: {Name:mk382c71693ea4061363f97b153b21bf6cdf5f38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:12:16.150828 2064791 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key
	I1219 06:12:16.150868 2064791 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key
	I1219 06:12:16.150873 2064791 certs.go:257] generating profile certs ...
	I1219 06:12:16.150961 2064791 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key
	I1219 06:12:16.151009 2064791 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key.febe6fed
	I1219 06:12:16.151048 2064791 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key
	I1219 06:12:16.151165 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem (1338 bytes)
	W1219 06:12:16.151195 2064791 certs.go:480] ignoring /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386_empty.pem, impossibly tiny 0 bytes
	I1219 06:12:16.151202 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem (1679 bytes)
	I1219 06:12:16.151230 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem (1078 bytes)
	I1219 06:12:16.151264 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem (1123 bytes)
	I1219 06:12:16.151286 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem (1671 bytes)
	I1219 06:12:16.151329 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:12:16.151962 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1219 06:12:16.174202 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1219 06:12:16.194590 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1219 06:12:16.215085 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1219 06:12:16.232627 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1219 06:12:16.250371 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1219 06:12:16.267689 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1219 06:12:16.285522 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1219 06:12:16.302837 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /usr/share/ca-certificates/20003862.pem (1708 bytes)
	I1219 06:12:16.320411 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1219 06:12:16.337922 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem --> /usr/share/ca-certificates/2000386.pem (1338 bytes)
	I1219 06:12:16.355077 2064791 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (722 bytes)
	I1219 06:12:16.368122 2064791 ssh_runner.go:195] Run: openssl version
	I1219 06:12:16.374305 2064791 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.381720 2064791 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1219 06:12:16.389786 2064791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.393456 2064791 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.393514 2064791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.434859 2064791 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1219 06:12:16.442942 2064791 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.450665 2064791 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/2000386.pem /etc/ssl/certs/2000386.pem
	I1219 06:12:16.458612 2064791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.462545 2064791 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.462603 2064791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.503732 2064791 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1219 06:12:16.511394 2064791 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.519328 2064791 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/20003862.pem /etc/ssl/certs/20003862.pem
	I1219 06:12:16.526844 2064791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.530487 2064791 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.530547 2064791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.571532 2064791 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1219 06:12:16.579524 2064791 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:12:16.583470 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1219 06:12:16.624483 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1219 06:12:16.665575 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1219 06:12:16.707109 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1219 06:12:16.749520 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1219 06:12:16.790988 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1219 06:12:16.831921 2064791 kubeadm.go:401] StartCluster: {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:12:16.832006 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1219 06:12:16.832084 2064791 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:12:16.857771 2064791 cri.go:92] found id: ""
	I1219 06:12:16.857833 2064791 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1219 06:12:16.866091 2064791 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1219 06:12:16.866101 2064791 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1219 06:12:16.866158 2064791 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1219 06:12:16.873926 2064791 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.874482 2064791 kubeconfig.go:125] found "functional-006924" server: "https://192.168.49.2:8441"
	I1219 06:12:16.875731 2064791 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1219 06:12:16.883987 2064791 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-19 05:57:41.594715365 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-19 06:12:15.784216685 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1219 06:12:16.884007 2064791 kubeadm.go:1161] stopping kube-system containers ...
	I1219 06:12:16.884018 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1219 06:12:16.884079 2064791 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:12:16.914439 2064791 cri.go:92] found id: ""
	I1219 06:12:16.914509 2064791 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1219 06:12:16.934128 2064791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:12:16.942432 2064791 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5631 Dec 19 06:01 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5640 Dec 19 06:01 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec 19 06:01 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec 19 06:01 /etc/kubernetes/scheduler.conf
	
	I1219 06:12:16.942490 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 06:12:16.950312 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 06:12:16.957901 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.957957 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:12:16.965831 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 06:12:16.973975 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.974043 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:12:16.981885 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 06:12:16.989698 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.989754 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:12:16.997294 2064791 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1219 06:12:17.007519 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:17.060607 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:18.829242 2064791 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.768608779s)
	I1219 06:12:18.829304 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:19.030093 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:19.096673 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:19.143573 2064791 api_server.go:52] waiting for apiserver process to appear ...
	I1219 06:12:19.143640 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	... (same "Run: sudo pgrep -xnf kube-apiserver.*minikube.*" poll repeated at ~500ms intervals, 119 similar lines from 06:12:19.643853 through 06:13:18.643722 omitted)
	I1219 06:13:19.143899 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:19.143976 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:19.173119 2064791 cri.go:92] found id: ""
	I1219 06:13:19.173133 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.173141 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:19.173146 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:19.173204 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:19.207794 2064791 cri.go:92] found id: ""
	I1219 06:13:19.207807 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.207814 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:19.207819 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:19.207884 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:19.237060 2064791 cri.go:92] found id: ""
	I1219 06:13:19.237074 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.237081 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:19.237092 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:19.237154 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:19.262099 2064791 cri.go:92] found id: ""
	I1219 06:13:19.262114 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.262121 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:19.262126 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:19.262185 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:19.287540 2064791 cri.go:92] found id: ""
	I1219 06:13:19.287554 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.287561 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:19.287566 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:19.287632 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:19.315088 2064791 cri.go:92] found id: ""
	I1219 06:13:19.315102 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.315109 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:19.315115 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:19.315176 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:19.340777 2064791 cri.go:92] found id: ""
	I1219 06:13:19.340791 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.340798 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:19.340806 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:19.340818 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:19.357916 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:19.357932 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:19.426302 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:19.417866   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.418382   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420072   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420408   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.421854   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:19.417866   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.418382   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420072   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420408   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.421854   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:19.426313 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:19.426323 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:19.488347 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:19.488367 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:19.520211 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:19.520229 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:22.084930 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:22.095535 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:22.095602 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:22.122011 2064791 cri.go:92] found id: ""
	I1219 06:13:22.122025 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.122034 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:22.122059 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:22.122131 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:22.146880 2064791 cri.go:92] found id: ""
	I1219 06:13:22.146893 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.146900 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:22.146905 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:22.146975 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:22.176007 2064791 cri.go:92] found id: ""
	I1219 06:13:22.176021 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.176028 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:22.176033 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:22.176095 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:22.211343 2064791 cri.go:92] found id: ""
	I1219 06:13:22.211357 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.211365 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:22.211370 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:22.211429 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:22.235806 2064791 cri.go:92] found id: ""
	I1219 06:13:22.235829 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.235836 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:22.235841 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:22.235910 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:22.260858 2064791 cri.go:92] found id: ""
	I1219 06:13:22.260882 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.260888 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:22.260894 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:22.260954 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:22.285583 2064791 cri.go:92] found id: ""
	I1219 06:13:22.285597 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.285604 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:22.285613 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:22.285624 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:22.302970 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:22.302988 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:22.371208 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:22.362378   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.363321   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365234   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365718   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.367191   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:22.362378   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.363321   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365234   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365718   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.367191   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:22.371227 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:22.371238 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:22.433354 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:22.433373 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:22.468288 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:22.468305 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:25.028097 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:25.038266 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:25.038327 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:25.066109 2064791 cri.go:92] found id: ""
	I1219 06:13:25.066123 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.066130 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:25.066136 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:25.066199 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:25.091083 2064791 cri.go:92] found id: ""
	I1219 06:13:25.091096 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.091103 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:25.091109 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:25.091175 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:25.116729 2064791 cri.go:92] found id: ""
	I1219 06:13:25.116743 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.116750 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:25.116808 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:25.116890 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:25.145471 2064791 cri.go:92] found id: ""
	I1219 06:13:25.145485 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.145492 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:25.145497 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:25.145555 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:25.173780 2064791 cri.go:92] found id: ""
	I1219 06:13:25.173795 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.173801 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:25.173807 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:25.173876 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:25.202994 2064791 cri.go:92] found id: ""
	I1219 06:13:25.203008 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.203015 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:25.203021 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:25.203082 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:25.228548 2064791 cri.go:92] found id: ""
	I1219 06:13:25.228563 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.228570 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:25.228578 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:25.228590 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:25.260074 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:25.260090 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:25.316293 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:25.316311 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:25.333755 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:25.333771 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:25.395261 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:25.387061   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.387481   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389091   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389803   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.391447   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:25.387061   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.387481   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389091   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389803   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.391447   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:25.395273 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:25.395290 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:27.958003 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:27.968507 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:27.968571 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:27.994859 2064791 cri.go:92] found id: ""
	I1219 06:13:27.994872 2064791 logs.go:282] 0 containers: []
	W1219 06:13:27.994879 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:27.994884 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:27.994942 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:28.023716 2064791 cri.go:92] found id: ""
	I1219 06:13:28.023729 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.023736 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:28.023741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:28.023807 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:28.048490 2064791 cri.go:92] found id: ""
	I1219 06:13:28.048504 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.048512 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:28.048517 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:28.048575 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:28.074305 2064791 cri.go:92] found id: ""
	I1219 06:13:28.074319 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.074326 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:28.074332 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:28.074392 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:28.098924 2064791 cri.go:92] found id: ""
	I1219 06:13:28.098938 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.098945 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:28.098950 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:28.099021 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:28.123000 2064791 cri.go:92] found id: ""
	I1219 06:13:28.123013 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.123021 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:28.123026 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:28.123091 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:28.150415 2064791 cri.go:92] found id: ""
	I1219 06:13:28.150428 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.150435 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:28.150443 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:28.150453 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:28.210763 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:28.210782 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:28.230191 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:28.230208 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:28.294389 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:28.286078   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.286664   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288135   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288528   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.290322   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:28.286078   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.286664   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288135   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288528   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.290322   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:28.294400 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:28.294411 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:28.357351 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:28.357371 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:30.888172 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:30.898614 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:30.898676 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:30.926377 2064791 cri.go:92] found id: ""
	I1219 06:13:30.926391 2064791 logs.go:282] 0 containers: []
	W1219 06:13:30.926398 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:30.926403 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:30.926458 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:30.950084 2064791 cri.go:92] found id: ""
	I1219 06:13:30.950097 2064791 logs.go:282] 0 containers: []
	W1219 06:13:30.950111 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:30.950117 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:30.950180 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:30.975713 2064791 cri.go:92] found id: ""
	I1219 06:13:30.975726 2064791 logs.go:282] 0 containers: []
	W1219 06:13:30.975734 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:30.975740 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:30.975798 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:31.012698 2064791 cri.go:92] found id: ""
	I1219 06:13:31.012712 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.012719 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:31.012725 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:31.012833 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:31.036945 2064791 cri.go:92] found id: ""
	I1219 06:13:31.036958 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.036965 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:31.036970 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:31.037028 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:31.062431 2064791 cri.go:92] found id: ""
	I1219 06:13:31.062445 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.062452 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:31.062457 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:31.062538 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:31.088075 2064791 cri.go:92] found id: ""
	I1219 06:13:31.088099 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.088106 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:31.088114 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:31.088123 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:31.143908 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:31.143928 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:31.164642 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:31.164661 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:31.241367 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:31.232734   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.233582   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235302   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235873   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.237342   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:31.232734   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.233582   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235302   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235873   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.237342   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:31.241378 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:31.241388 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:31.304583 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:31.304602 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:33.835874 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:33.847289 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:33.847350 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:33.874497 2064791 cri.go:92] found id: ""
	I1219 06:13:33.874511 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.874518 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:33.874523 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:33.874602 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:33.899113 2064791 cri.go:92] found id: ""
	I1219 06:13:33.899127 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.899134 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:33.899139 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:33.899198 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:33.927533 2064791 cri.go:92] found id: ""
	I1219 06:13:33.927546 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.927553 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:33.927559 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:33.927616 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:33.955150 2064791 cri.go:92] found id: ""
	I1219 06:13:33.955163 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.955170 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:33.955176 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:33.955233 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:33.979739 2064791 cri.go:92] found id: ""
	I1219 06:13:33.979753 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.979760 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:33.979765 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:33.979824 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:34.005264 2064791 cri.go:92] found id: ""
	I1219 06:13:34.005283 2064791 logs.go:282] 0 containers: []
	W1219 06:13:34.005291 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:34.005298 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:34.005375 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:34.031917 2064791 cri.go:92] found id: ""
	I1219 06:13:34.031931 2064791 logs.go:282] 0 containers: []
	W1219 06:13:34.031949 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:34.031958 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:34.031968 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:34.098907 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:34.098938 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:34.117494 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:34.117513 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:34.190606 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:34.181776   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.182594   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184322   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184963   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.186654   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:34.181776   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.182594   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184322   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184963   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.186654   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:34.190617 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:34.190630 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:34.260586 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:34.260607 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:36.792986 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:36.803226 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:36.803292 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:36.830943 2064791 cri.go:92] found id: ""
	I1219 06:13:36.830957 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.830964 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:36.830970 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:36.831029 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:36.856036 2064791 cri.go:92] found id: ""
	I1219 06:13:36.856051 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.856058 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:36.856063 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:36.856133 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:36.880807 2064791 cri.go:92] found id: ""
	I1219 06:13:36.880821 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.880828 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:36.880834 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:36.880893 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:36.904515 2064791 cri.go:92] found id: ""
	I1219 06:13:36.904529 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.904536 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:36.904542 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:36.904601 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:36.929517 2064791 cri.go:92] found id: ""
	I1219 06:13:36.929530 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.929538 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:36.929543 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:36.929615 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:36.953623 2064791 cri.go:92] found id: ""
	I1219 06:13:36.953636 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.953644 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:36.953650 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:36.953706 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:36.978769 2064791 cri.go:92] found id: ""
	I1219 06:13:36.978783 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.978790 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:36.978797 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:36.978807 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:37.036051 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:37.036072 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:37.053881 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:37.053898 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:37.117512 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:37.109460   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.110050   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.111554   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.112021   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.113512   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:37.109460   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.110050   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.111554   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.112021   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.113512   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:37.117523 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:37.117532 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:37.185580 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:37.185599 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:39.724185 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:39.735602 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:39.735670 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:39.760200 2064791 cri.go:92] found id: ""
	I1219 06:13:39.760214 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.760222 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:39.760227 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:39.760286 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:39.787416 2064791 cri.go:92] found id: ""
	I1219 06:13:39.787429 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.787437 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:39.787442 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:39.787505 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:39.811808 2064791 cri.go:92] found id: ""
	I1219 06:13:39.811822 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.811830 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:39.811836 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:39.811902 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:39.837072 2064791 cri.go:92] found id: ""
	I1219 06:13:39.837086 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.837093 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:39.837099 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:39.837200 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:39.866418 2064791 cri.go:92] found id: ""
	I1219 06:13:39.866432 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.866438 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:39.866444 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:39.866502 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:39.894744 2064791 cri.go:92] found id: ""
	I1219 06:13:39.894758 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.894765 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:39.894770 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:39.894833 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:39.921608 2064791 cri.go:92] found id: ""
	I1219 06:13:39.921622 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.921629 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:39.921643 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:39.921654 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:39.985200 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:39.985220 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:40.004064 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:40.004091 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:40.077619 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:40.069025   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.069761   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.071391   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.072228   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.073414   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:40.069025   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.069761   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.071391   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.072228   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.073414   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:40.077631 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:40.077641 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:40.142102 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:40.142127 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:42.682372 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:42.692608 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:42.692675 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:42.716749 2064791 cri.go:92] found id: ""
	I1219 06:13:42.716796 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.716804 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:42.716809 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:42.716888 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:42.740973 2064791 cri.go:92] found id: ""
	I1219 06:13:42.740986 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.740993 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:42.740999 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:42.741064 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:42.765521 2064791 cri.go:92] found id: ""
	I1219 06:13:42.765535 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.765543 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:42.765548 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:42.765607 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:42.790000 2064791 cri.go:92] found id: ""
	I1219 06:13:42.790015 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.790034 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:42.790040 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:42.790107 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:42.813722 2064791 cri.go:92] found id: ""
	I1219 06:13:42.813736 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.813743 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:42.813752 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:42.813814 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:42.838912 2064791 cri.go:92] found id: ""
	I1219 06:13:42.838926 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.838934 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:42.838939 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:42.839002 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:42.867044 2064791 cri.go:92] found id: ""
	I1219 06:13:42.867058 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.867065 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:42.867073 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:42.867083 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:42.923612 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:42.923632 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:42.941274 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:42.941293 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:43.008705 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:42.998396   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:42.999004   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000468   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000919   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.003777   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:42.998396   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:42.999004   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000468   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000919   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.003777   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:43.008716 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:43.008736 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:43.074629 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:43.074654 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:45.608725 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:45.619043 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:45.619107 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:45.645025 2064791 cri.go:92] found id: ""
	I1219 06:13:45.645041 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.645049 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:45.645054 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:45.645120 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:45.671700 2064791 cri.go:92] found id: ""
	I1219 06:13:45.671716 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.671723 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:45.671735 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:45.671797 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:45.701839 2064791 cri.go:92] found id: ""
	I1219 06:13:45.701864 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.701872 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:45.701878 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:45.701947 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:45.731819 2064791 cri.go:92] found id: ""
	I1219 06:13:45.731834 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.731841 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:45.731847 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:45.731910 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:45.758372 2064791 cri.go:92] found id: ""
	I1219 06:13:45.758386 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.758393 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:45.758399 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:45.758464 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:45.784713 2064791 cri.go:92] found id: ""
	I1219 06:13:45.784727 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.784734 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:45.784739 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:45.784829 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:45.811948 2064791 cri.go:92] found id: ""
	I1219 06:13:45.811962 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.811969 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:45.811977 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:45.811987 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:45.868299 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:45.868317 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:45.886032 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:45.886049 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:45.952733 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:45.944285   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.944957   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946453   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946859   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.948306   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:45.944285   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.944957   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946453   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946859   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.948306   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:45.952743 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:45.952783 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:46.020565 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:46.020588 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:48.550865 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:48.561408 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:48.561483 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:48.586776 2064791 cri.go:92] found id: ""
	I1219 06:13:48.586790 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.586797 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:48.586802 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:48.586864 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:48.612701 2064791 cri.go:92] found id: ""
	I1219 06:13:48.612715 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.612722 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:48.612727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:48.612808 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:48.637097 2064791 cri.go:92] found id: ""
	I1219 06:13:48.637110 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.637118 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:48.637124 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:48.637183 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:48.662701 2064791 cri.go:92] found id: ""
	I1219 06:13:48.662715 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.662722 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:48.662727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:48.662785 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:48.690291 2064791 cri.go:92] found id: ""
	I1219 06:13:48.690304 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.690311 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:48.690316 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:48.690376 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:48.715968 2064791 cri.go:92] found id: ""
	I1219 06:13:48.715983 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.715990 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:48.715995 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:48.716059 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:48.741069 2064791 cri.go:92] found id: ""
	I1219 06:13:48.741082 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.741090 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:48.741097 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:48.741113 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:48.796842 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:48.796863 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:48.814146 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:48.814166 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:48.879995 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:48.871133   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.871771   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.873597   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.874180   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.875997   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:48.871133   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.871771   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.873597   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.874180   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.875997   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:48.880005 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:48.880017 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:48.943211 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:48.943231 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:51.472961 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:51.483727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:51.483805 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:51.513333 2064791 cri.go:92] found id: ""
	I1219 06:13:51.513347 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.513354 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:51.513360 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:51.513426 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:51.539359 2064791 cri.go:92] found id: ""
	I1219 06:13:51.539373 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.539380 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:51.539392 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:51.539449 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:51.564730 2064791 cri.go:92] found id: ""
	I1219 06:13:51.564743 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.564750 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:51.564794 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:51.564855 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:51.590117 2064791 cri.go:92] found id: ""
	I1219 06:13:51.590138 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.590145 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:51.590150 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:51.590210 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:51.614688 2064791 cri.go:92] found id: ""
	I1219 06:13:51.614702 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.614709 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:51.614715 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:51.614778 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:51.638492 2064791 cri.go:92] found id: ""
	I1219 06:13:51.638508 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.638518 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:51.638524 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:51.638597 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:51.666861 2064791 cri.go:92] found id: ""
	I1219 06:13:51.666874 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.666881 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:51.666888 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:51.666899 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:51.731208 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:51.723294   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.723881   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725543   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725958   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.727478   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:51.723294   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.723881   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725543   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725958   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.727478   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:51.731218 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:51.731228 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:51.793354 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:51.793375 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:51.819761 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:51.819784 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:51.877976 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:51.877996 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:54.395396 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:54.405788 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:54.405848 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:54.444121 2064791 cri.go:92] found id: ""
	I1219 06:13:54.444151 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.444159 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:54.444164 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:54.444243 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:54.471038 2064791 cri.go:92] found id: ""
	I1219 06:13:54.471064 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.471072 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:54.471077 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:54.471160 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:54.500364 2064791 cri.go:92] found id: ""
	I1219 06:13:54.500377 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.500385 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:54.500390 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:54.500450 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:54.525919 2064791 cri.go:92] found id: ""
	I1219 06:13:54.525934 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.525941 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:54.525962 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:54.526021 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:54.551211 2064791 cri.go:92] found id: ""
	I1219 06:13:54.551225 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.551232 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:54.551239 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:54.551310 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:54.577841 2064791 cri.go:92] found id: ""
	I1219 06:13:54.577854 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.577861 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:54.577866 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:54.577931 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:54.602636 2064791 cri.go:92] found id: ""
	I1219 06:13:54.602650 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.602656 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:54.602664 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:54.602675 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:54.619644 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:54.619661 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:54.682901 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:54.674199   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.675104   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.676718   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.677329   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.678988   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:54.674199   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.675104   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.676718   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.677329   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.678988   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:54.682911 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:54.682921 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:54.749370 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:54.749393 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:54.780731 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:54.780747 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:57.338712 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:57.349237 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:57.349299 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:57.378161 2064791 cri.go:92] found id: ""
	I1219 06:13:57.378175 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.378181 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:57.378187 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:57.378247 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:57.403073 2064791 cri.go:92] found id: ""
	I1219 06:13:57.403087 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.403094 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:57.403099 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:57.403160 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:57.431222 2064791 cri.go:92] found id: ""
	I1219 06:13:57.431236 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.431244 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:57.431249 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:57.431306 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:57.466943 2064791 cri.go:92] found id: ""
	I1219 06:13:57.466957 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.466964 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:57.466969 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:57.467027 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:57.493181 2064791 cri.go:92] found id: ""
	I1219 06:13:57.493194 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.493201 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:57.493206 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:57.493265 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:57.517521 2064791 cri.go:92] found id: ""
	I1219 06:13:57.517534 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.517543 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:57.517549 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:57.517606 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:57.546827 2064791 cri.go:92] found id: ""
	I1219 06:13:57.546841 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.546848 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:57.546856 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:57.546865 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:57.603521 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:57.603540 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:57.620971 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:57.620988 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:57.687316 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:57.678838   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.679404   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.680987   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.681452   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.683006   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:57.678838   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.679404   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.680987   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.681452   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.683006   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:57.687326 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:57.687336 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:57.759758 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:57.759787 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:00.293478 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:00.313120 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:00.313205 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:00.349921 2064791 cri.go:92] found id: ""
	I1219 06:14:00.349938 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.349947 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:00.349953 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:00.350031 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:00.381005 2064791 cri.go:92] found id: ""
	I1219 06:14:00.381022 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.381031 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:00.381037 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:00.381113 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:00.415179 2064791 cri.go:92] found id: ""
	I1219 06:14:00.415194 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.415202 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:00.415207 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:00.415268 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:00.455068 2064791 cri.go:92] found id: ""
	I1219 06:14:00.455084 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.455090 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:00.455096 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:00.455170 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:00.488360 2064791 cri.go:92] found id: ""
	I1219 06:14:00.488374 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.488382 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:00.488387 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:00.488450 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:00.514399 2064791 cri.go:92] found id: ""
	I1219 06:14:00.514414 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.514420 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:00.514426 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:00.514485 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:00.544639 2064791 cri.go:92] found id: ""
	I1219 06:14:00.544655 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.544662 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:00.544670 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:00.544683 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:00.562442 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:00.562459 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:00.630032 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:00.620824   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.621603   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.623426   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.624012   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.625740   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:00.620824   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.621603   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.623426   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.624012   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.625740   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:00.630043 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:00.630053 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:00.693056 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:00.693075 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:00.724344 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:00.724362 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:03.282407 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:03.292404 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:03.292463 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:03.322285 2064791 cri.go:92] found id: ""
	I1219 06:14:03.322298 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.322305 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:03.322310 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:03.322377 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:03.345824 2064791 cri.go:92] found id: ""
	I1219 06:14:03.345838 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.345846 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:03.345852 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:03.345913 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:03.369194 2064791 cri.go:92] found id: ""
	I1219 06:14:03.369208 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.369214 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:03.369220 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:03.369280 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:03.393453 2064791 cri.go:92] found id: ""
	I1219 06:14:03.393467 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.393474 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:03.393479 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:03.393538 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:03.423067 2064791 cri.go:92] found id: ""
	I1219 06:14:03.423082 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.423088 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:03.423093 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:03.423149 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:03.449404 2064791 cri.go:92] found id: ""
	I1219 06:14:03.449418 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.449424 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:03.449430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:03.449491 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:03.483320 2064791 cri.go:92] found id: ""
	I1219 06:14:03.483334 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.483342 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:03.483349 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:03.483360 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:03.546816 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:03.538361   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.539133   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.540745   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.541423   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.542995   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:03.538361   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.539133   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.540745   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.541423   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.542995   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:03.546828 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:03.546840 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:03.608924 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:03.608943 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:03.640931 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:03.640947 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:03.698583 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:03.698601 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:06.217289 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:06.228468 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:06.228538 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:06.254249 2064791 cri.go:92] found id: ""
	I1219 06:14:06.254264 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.254271 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:06.254276 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:06.254335 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:06.278663 2064791 cri.go:92] found id: ""
	I1219 06:14:06.278677 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.278685 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:06.278691 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:06.278751 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:06.304128 2064791 cri.go:92] found id: ""
	I1219 06:14:06.304143 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.304150 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:06.304162 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:06.304224 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:06.330238 2064791 cri.go:92] found id: ""
	I1219 06:14:06.330252 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.330259 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:06.330265 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:06.330326 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:06.354219 2064791 cri.go:92] found id: ""
	I1219 06:14:06.354234 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.354241 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:06.354246 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:06.354307 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:06.382747 2064791 cri.go:92] found id: ""
	I1219 06:14:06.382762 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.382769 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:06.382777 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:06.382837 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:06.421656 2064791 cri.go:92] found id: ""
	I1219 06:14:06.421670 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.421677 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:06.421685 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:06.421694 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:06.498836 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:06.498857 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:06.531636 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:06.531653 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:06.590085 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:06.590106 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:06.608226 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:06.608243 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:06.675159 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:06.667629   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.668042   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669562   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669908   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.671371   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:06.667629   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.668042   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669562   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669908   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.671371   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:09.176005 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:09.186839 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:09.186916 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:09.211786 2064791 cri.go:92] found id: ""
	I1219 06:14:09.211800 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.211807 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:09.211812 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:09.211873 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:09.240415 2064791 cri.go:92] found id: ""
	I1219 06:14:09.240429 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.240436 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:09.240441 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:09.240503 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:09.266183 2064791 cri.go:92] found id: ""
	I1219 06:14:09.266197 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.266204 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:09.266209 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:09.266269 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:09.294483 2064791 cri.go:92] found id: ""
	I1219 06:14:09.294497 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.294504 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:09.294509 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:09.294572 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:09.319997 2064791 cri.go:92] found id: ""
	I1219 06:14:09.320011 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.320019 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:09.320024 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:09.320113 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:09.346661 2064791 cri.go:92] found id: ""
	I1219 06:14:09.346675 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.346683 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:09.346688 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:09.346746 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:09.371664 2064791 cri.go:92] found id: ""
	I1219 06:14:09.371690 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.371698 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:09.371706 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:09.371717 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:09.389515 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:09.389534 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:09.473775 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:09.465363   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.465921   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.467541   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.468108   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.469733   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:09.465363   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.465921   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.467541   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.468108   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.469733   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:09.473785 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:09.473796 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:09.541712 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:09.541736 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:09.577440 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:09.577456 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:12.133722 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:12.144214 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:12.144277 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:12.170929 2064791 cri.go:92] found id: ""
	I1219 06:14:12.170944 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.170951 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:12.170956 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:12.171026 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:12.195988 2064791 cri.go:92] found id: ""
	I1219 06:14:12.196002 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.196008 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:12.196014 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:12.196073 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:12.221254 2064791 cri.go:92] found id: ""
	I1219 06:14:12.221269 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.221276 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:12.221281 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:12.221346 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:12.246403 2064791 cri.go:92] found id: ""
	I1219 06:14:12.246417 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.246424 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:12.246430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:12.246491 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:12.271124 2064791 cri.go:92] found id: ""
	I1219 06:14:12.271139 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.271145 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:12.271150 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:12.271209 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:12.296180 2064791 cri.go:92] found id: ""
	I1219 06:14:12.296194 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.296211 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:12.296216 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:12.296284 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:12.322520 2064791 cri.go:92] found id: ""
	I1219 06:14:12.322534 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.322541 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:12.322548 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:12.322559 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:12.349890 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:12.349907 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:12.407189 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:12.407210 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:12.426453 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:12.426469 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:12.499487 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:12.491549   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.491978   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493525   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493875   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.495402   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:12.491549   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.491978   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493525   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493875   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.495402   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:12.499498 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:12.499509 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:15.067160 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:15.078543 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:15.078611 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:15.104838 2064791 cri.go:92] found id: ""
	I1219 06:14:15.104852 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.104860 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:15.104865 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:15.104933 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:15.130179 2064791 cri.go:92] found id: ""
	I1219 06:14:15.130194 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.130201 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:15.130207 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:15.130268 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:15.156134 2064791 cri.go:92] found id: ""
	I1219 06:14:15.156147 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.156154 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:15.156159 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:15.156221 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:15.182543 2064791 cri.go:92] found id: ""
	I1219 06:14:15.182557 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.182564 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:15.182570 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:15.182631 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:15.212350 2064791 cri.go:92] found id: ""
	I1219 06:14:15.212364 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.212371 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:15.212376 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:15.212437 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:15.239403 2064791 cri.go:92] found id: ""
	I1219 06:14:15.239418 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.239425 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:15.239430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:15.239490 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:15.265288 2064791 cri.go:92] found id: ""
	I1219 06:14:15.265303 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.265310 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:15.265318 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:15.265328 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:15.322825 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:15.322845 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:15.339946 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:15.339963 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:15.406282 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:15.394886   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.395459   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397067   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397414   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.399914   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:15.394886   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.395459   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397067   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397414   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.399914   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:15.406294 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:15.406305 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:15.481322 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:15.481342 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:18.011054 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:18.022305 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:18.022367 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:18.048236 2064791 cri.go:92] found id: ""
	I1219 06:14:18.048250 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.048257 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:18.048262 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:18.048326 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:18.075811 2064791 cri.go:92] found id: ""
	I1219 06:14:18.075825 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.075833 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:18.075839 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:18.075911 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:18.101578 2064791 cri.go:92] found id: ""
	I1219 06:14:18.101593 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.101601 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:18.101607 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:18.101668 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:18.127312 2064791 cri.go:92] found id: ""
	I1219 06:14:18.127327 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.127335 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:18.127341 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:18.127400 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:18.153616 2064791 cri.go:92] found id: ""
	I1219 06:14:18.153630 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.153637 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:18.153642 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:18.153702 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:18.177937 2064791 cri.go:92] found id: ""
	I1219 06:14:18.177959 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.177967 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:18.177972 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:18.178044 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:18.211563 2064791 cri.go:92] found id: ""
	I1219 06:14:18.211576 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.211583 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:18.211591 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:18.211614 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:18.270162 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:18.270182 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:18.288230 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:18.288247 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:18.351713 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:18.343431   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.343955   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.345552   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.346118   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.347689   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:18.343431   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.343955   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.345552   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.346118   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.347689   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:18.351723 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:18.351734 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:18.415359 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:18.415379 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:20.949383 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:20.959444 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:20.959504 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:20.984028 2064791 cri.go:92] found id: ""
	I1219 06:14:20.984041 2064791 logs.go:282] 0 containers: []
	W1219 06:14:20.984048 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:20.984054 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:20.984114 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:21.011129 2064791 cri.go:92] found id: ""
	I1219 06:14:21.011145 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.011153 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:21.011159 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:21.011232 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:21.036500 2064791 cri.go:92] found id: ""
	I1219 06:14:21.036515 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.036522 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:21.036528 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:21.036593 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:21.061075 2064791 cri.go:92] found id: ""
	I1219 06:14:21.061092 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.061099 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:21.061106 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:21.061164 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:21.086516 2064791 cri.go:92] found id: ""
	I1219 06:14:21.086532 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.086539 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:21.086545 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:21.086606 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:21.110771 2064791 cri.go:92] found id: ""
	I1219 06:14:21.110791 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.110798 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:21.110804 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:21.110861 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:21.135223 2064791 cri.go:92] found id: ""
	I1219 06:14:21.135237 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.135244 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:21.135253 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:21.135262 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:21.198022 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:21.198041 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:21.227058 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:21.227074 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:21.285376 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:21.285395 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:21.302978 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:21.302996 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:21.371361 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:21.363586   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.364188   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365313   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365885   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.367496   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:21.363586   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.364188   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365313   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365885   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.367496   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:23.871625 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:23.882253 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:23.882315 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:23.911700 2064791 cri.go:92] found id: ""
	I1219 06:14:23.911715 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.911722 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:23.911727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:23.911792 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:23.940526 2064791 cri.go:92] found id: ""
	I1219 06:14:23.940542 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.940549 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:23.940554 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:23.940613 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:23.965505 2064791 cri.go:92] found id: ""
	I1219 06:14:23.965520 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.965527 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:23.965532 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:23.965592 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:23.990160 2064791 cri.go:92] found id: ""
	I1219 06:14:23.990174 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.990180 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:23.990186 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:23.990244 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:24.020703 2064791 cri.go:92] found id: ""
	I1219 06:14:24.020718 2064791 logs.go:282] 0 containers: []
	W1219 06:14:24.020731 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:24.020736 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:24.020818 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:24.045597 2064791 cri.go:92] found id: ""
	I1219 06:14:24.045611 2064791 logs.go:282] 0 containers: []
	W1219 06:14:24.045619 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:24.045625 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:24.045687 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:24.070650 2064791 cri.go:92] found id: ""
	I1219 06:14:24.070665 2064791 logs.go:282] 0 containers: []
	W1219 06:14:24.070673 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:24.070681 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:24.070692 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:24.088118 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:24.088135 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:24.154756 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:24.145796   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.146235   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.147863   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.148542   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.150277   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:24.145796   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.146235   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.147863   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.148542   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.150277   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:24.154766 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:24.154777 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:24.222682 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:24.222712 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:24.251017 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:24.251036 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:26.810547 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:26.821800 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:26.821882 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:26.851618 2064791 cri.go:92] found id: ""
	I1219 06:14:26.851632 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.851639 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:26.851644 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:26.851701 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:26.881247 2064791 cri.go:92] found id: ""
	I1219 06:14:26.881261 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.881268 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:26.881273 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:26.881331 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:26.906685 2064791 cri.go:92] found id: ""
	I1219 06:14:26.906698 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.906705 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:26.906710 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:26.906769 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:26.930800 2064791 cri.go:92] found id: ""
	I1219 06:14:26.930814 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.930821 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:26.930826 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:26.930886 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:26.955923 2064791 cri.go:92] found id: ""
	I1219 06:14:26.955936 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.955943 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:26.955949 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:26.956007 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:26.981009 2064791 cri.go:92] found id: ""
	I1219 06:14:26.981023 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.981030 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:26.981036 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:26.981100 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:27.008093 2064791 cri.go:92] found id: ""
	I1219 06:14:27.008107 2064791 logs.go:282] 0 containers: []
	W1219 06:14:27.008115 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:27.008123 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:27.008133 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:27.064465 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:27.064484 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:27.082027 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:27.082043 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:27.147050 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:27.138327   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.139038   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.140978   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.141575   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.143181   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:27.138327   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.139038   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.140978   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.141575   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.143181   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:27.147061 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:27.147072 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:27.209843 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:27.209866 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:29.744581 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:29.755392 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:29.755453 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:29.786638 2064791 cri.go:92] found id: ""
	I1219 06:14:29.786652 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.786659 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:29.786664 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:29.786724 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:29.812212 2064791 cri.go:92] found id: ""
	I1219 06:14:29.812225 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.812232 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:29.812237 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:29.812296 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:29.836877 2064791 cri.go:92] found id: ""
	I1219 06:14:29.836892 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.836899 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:29.836905 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:29.836964 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:29.861702 2064791 cri.go:92] found id: ""
	I1219 06:14:29.861715 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.861722 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:29.861727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:29.861786 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:29.885680 2064791 cri.go:92] found id: ""
	I1219 06:14:29.885694 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.885703 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:29.885708 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:29.885770 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:29.910947 2064791 cri.go:92] found id: ""
	I1219 06:14:29.910961 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.910968 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:29.910973 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:29.911034 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:29.935050 2064791 cri.go:92] found id: ""
	I1219 06:14:29.935065 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.935072 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:29.935080 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:29.935090 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:29.998135 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:29.998156 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:30.043603 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:30.043622 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:30.105767 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:30.105788 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:30.123694 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:30.123713 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:30.194778 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:30.185852   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.186558   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188234   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188924   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.190530   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:30.185852   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.186558   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188234   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188924   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.190530   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:32.694996 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:32.706674 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:32.706732 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:32.732252 2064791 cri.go:92] found id: ""
	I1219 06:14:32.732268 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.732276 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:32.732282 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:32.732344 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:32.758653 2064791 cri.go:92] found id: ""
	I1219 06:14:32.758667 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.758674 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:32.758679 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:32.758739 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:32.784000 2064791 cri.go:92] found id: ""
	I1219 06:14:32.784015 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.784032 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:32.784037 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:32.784104 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:32.812817 2064791 cri.go:92] found id: ""
	I1219 06:14:32.812840 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.812847 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:32.812856 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:32.812927 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:32.838382 2064791 cri.go:92] found id: ""
	I1219 06:14:32.838396 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.838404 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:32.838409 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:32.838470 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:32.865911 2064791 cri.go:92] found id: ""
	I1219 06:14:32.865929 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.865937 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:32.865944 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:32.866010 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:32.890355 2064791 cri.go:92] found id: ""
	I1219 06:14:32.890369 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.890376 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:32.890384 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:32.890394 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:32.946230 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:32.946249 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:32.964055 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:32.964071 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:33.030318 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:33.021305   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.022105   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.023970   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.024656   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.026568   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:33.021305   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.022105   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.023970   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.024656   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.026568   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:33.030328 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:33.030341 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:33.097167 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:33.097188 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:35.628021 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:35.638217 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:35.638279 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:35.675187 2064791 cri.go:92] found id: ""
	I1219 06:14:35.675209 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.675217 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:35.675223 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:35.675283 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:35.703303 2064791 cri.go:92] found id: ""
	I1219 06:14:35.703317 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.703324 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:35.703329 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:35.703387 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:35.736481 2064791 cri.go:92] found id: ""
	I1219 06:14:35.736495 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.736502 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:35.736507 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:35.736571 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:35.761459 2064791 cri.go:92] found id: ""
	I1219 06:14:35.761472 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.761479 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:35.761485 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:35.761542 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:35.785228 2064791 cri.go:92] found id: ""
	I1219 06:14:35.785242 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.785249 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:35.785255 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:35.785317 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:35.811887 2064791 cri.go:92] found id: ""
	I1219 06:14:35.811901 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.811908 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:35.811913 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:35.811971 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:35.837382 2064791 cri.go:92] found id: ""
	I1219 06:14:35.837395 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.837402 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:35.837410 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:35.837420 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:35.893642 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:35.893663 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:35.911983 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:35.911999 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:35.979649 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:35.971161   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.971848   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.973453   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.974018   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.975638   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:35.971161   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.971848   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.973453   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.974018   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.975638   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:35.979659 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:35.979669 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:36.041989 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:36.042008 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:38.571113 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:38.581755 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:38.581829 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:38.606952 2064791 cri.go:92] found id: ""
	I1219 06:14:38.606977 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.606985 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:38.607000 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:38.607062 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:38.641457 2064791 cri.go:92] found id: ""
	I1219 06:14:38.641470 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.641477 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:38.641482 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:38.641544 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:38.675510 2064791 cri.go:92] found id: ""
	I1219 06:14:38.675523 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.675530 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:38.675536 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:38.675597 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:38.701888 2064791 cri.go:92] found id: ""
	I1219 06:14:38.701902 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.701909 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:38.701915 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:38.701975 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:38.728277 2064791 cri.go:92] found id: ""
	I1219 06:14:38.728290 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.728299 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:38.728305 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:38.728365 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:38.755404 2064791 cri.go:92] found id: ""
	I1219 06:14:38.755418 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.755427 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:38.755433 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:38.755495 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:38.778883 2064791 cri.go:92] found id: ""
	I1219 06:14:38.778896 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.778903 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:38.778911 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:38.778921 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:38.807023 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:38.807039 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:38.867198 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:38.867217 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:38.885283 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:38.885299 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:38.953980 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:38.945374   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.946114   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.947740   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.948287   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.949494   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:38.945374   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.946114   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.947740   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.948287   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.949494   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:38.953990 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:38.954002 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:41.516935 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:41.527938 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:41.528001 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:41.553242 2064791 cri.go:92] found id: ""
	I1219 06:14:41.553256 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.553263 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:41.553268 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:41.553333 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:41.579295 2064791 cri.go:92] found id: ""
	I1219 06:14:41.579309 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.579316 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:41.579321 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:41.579385 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:41.605144 2064791 cri.go:92] found id: ""
	I1219 06:14:41.605157 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.605164 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:41.605169 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:41.605237 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:41.629732 2064791 cri.go:92] found id: ""
	I1219 06:14:41.629747 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.629754 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:41.629760 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:41.629822 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:41.659346 2064791 cri.go:92] found id: ""
	I1219 06:14:41.659361 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.659368 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:41.659373 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:41.659432 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:41.690573 2064791 cri.go:92] found id: ""
	I1219 06:14:41.690598 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.690606 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:41.690612 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:41.690681 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:41.732984 2064791 cri.go:92] found id: ""
	I1219 06:14:41.732998 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.733006 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:41.733013 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:41.733023 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:41.795851 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:41.795871 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:41.825041 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:41.825056 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:41.886639 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:41.886659 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:41.904083 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:41.904100 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:41.971851 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:41.963475   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.964169   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.965774   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.966365   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.967995   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:41.963475   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.964169   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.965774   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.966365   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.967995   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:44.473271 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:44.483164 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:44.483222 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:44.511046 2064791 cri.go:92] found id: ""
	I1219 06:14:44.511060 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.511067 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:44.511072 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:44.511131 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:44.536197 2064791 cri.go:92] found id: ""
	I1219 06:14:44.536211 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.536219 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:44.536224 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:44.536283 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:44.562337 2064791 cri.go:92] found id: ""
	I1219 06:14:44.562354 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.562360 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:44.562366 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:44.562474 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:44.587553 2064791 cri.go:92] found id: ""
	I1219 06:14:44.587567 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.587574 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:44.587579 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:44.587637 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:44.614987 2064791 cri.go:92] found id: ""
	I1219 06:14:44.615000 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.615007 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:44.615012 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:44.615070 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:44.638714 2064791 cri.go:92] found id: ""
	I1219 06:14:44.638727 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.638734 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:44.638740 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:44.638800 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:44.688380 2064791 cri.go:92] found id: ""
	I1219 06:14:44.688393 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.688401 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:44.688409 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:44.688419 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:44.752969 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:44.752989 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:44.770407 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:44.770424 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:44.837420 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:44.829128   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.829667   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831337   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831916   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.833576   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:44.829128   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.829667   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831337   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831916   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.833576   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:44.837430 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:44.837440 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:44.899538 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:44.899557 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:47.426650 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:47.436749 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:47.436827 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:47.461986 2064791 cri.go:92] found id: ""
	I1219 06:14:47.462000 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.462007 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:47.462012 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:47.462071 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:47.487738 2064791 cri.go:92] found id: ""
	I1219 06:14:47.487765 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.487785 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:47.487790 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:47.487934 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:47.517657 2064791 cri.go:92] found id: ""
	I1219 06:14:47.517671 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.517678 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:47.517683 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:47.517741 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:47.541725 2064791 cri.go:92] found id: ""
	I1219 06:14:47.541740 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.541747 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:47.541752 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:47.541811 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:47.566613 2064791 cri.go:92] found id: ""
	I1219 06:14:47.566627 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.566634 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:47.566640 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:47.566698 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:47.593670 2064791 cri.go:92] found id: ""
	I1219 06:14:47.593683 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.593690 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:47.593705 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:47.593778 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:47.617501 2064791 cri.go:92] found id: ""
	I1219 06:14:47.617516 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.617523 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:47.617530 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:47.617544 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:47.699175 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:47.685609   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.686090   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.691538   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.692419   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.694959   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:47.685609   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.686090   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.691538   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.692419   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.694959   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:47.699185 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:47.699195 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:47.763955 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:47.763976 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:47.796195 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:47.796212 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:47.855457 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:47.855477 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:50.373913 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:50.384678 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:50.384743 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:50.409292 2064791 cri.go:92] found id: ""
	I1219 06:14:50.409305 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.409314 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:50.409319 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:50.409380 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:50.434622 2064791 cri.go:92] found id: ""
	I1219 06:14:50.434637 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.434644 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:50.434649 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:50.434708 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:50.462727 2064791 cri.go:92] found id: ""
	I1219 06:14:50.462741 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.462748 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:50.462754 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:50.462818 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:50.487565 2064791 cri.go:92] found id: ""
	I1219 06:14:50.487578 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.487586 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:50.487593 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:50.487655 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:50.514337 2064791 cri.go:92] found id: ""
	I1219 06:14:50.514351 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.514358 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:50.514363 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:50.514428 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:50.538808 2064791 cri.go:92] found id: ""
	I1219 06:14:50.538822 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.538829 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:50.538835 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:50.538900 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:50.562833 2064791 cri.go:92] found id: ""
	I1219 06:14:50.562847 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.562854 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:50.562862 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:50.562872 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:50.630176 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:50.621836   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.622705   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624224   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624675   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.626153   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:50.621836   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.622705   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624224   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624675   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.626153   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:50.630187 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:50.630197 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:50.701427 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:50.701449 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:50.729581 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:50.729602 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:50.786455 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:50.786479 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:53.304847 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:53.315504 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:53.315564 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:53.340157 2064791 cri.go:92] found id: ""
	I1219 06:14:53.340172 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.340179 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:53.340184 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:53.340242 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:53.368950 2064791 cri.go:92] found id: ""
	I1219 06:14:53.368964 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.368971 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:53.368976 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:53.369037 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:53.393336 2064791 cri.go:92] found id: ""
	I1219 06:14:53.393349 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.393356 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:53.393362 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:53.393419 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:53.417054 2064791 cri.go:92] found id: ""
	I1219 06:14:53.417069 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.417085 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:53.417091 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:53.417163 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:53.440932 2064791 cri.go:92] found id: ""
	I1219 06:14:53.440946 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.440953 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:53.440958 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:53.441016 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:53.464424 2064791 cri.go:92] found id: ""
	I1219 06:14:53.464437 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.464444 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:53.464449 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:53.464509 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:53.488126 2064791 cri.go:92] found id: ""
	I1219 06:14:53.488143 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.488150 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:53.488158 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:53.488168 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:53.558644 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:53.550747   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.551416   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553060   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553386   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.554859   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:53.550747   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.551416   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553060   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553386   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.554859   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:53.558655 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:53.558665 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:53.622193 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:53.622214 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:53.650744 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:53.650759 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:53.710733 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:53.710750 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:56.228553 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:56.238967 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:56.239030 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:56.262851 2064791 cri.go:92] found id: ""
	I1219 06:14:56.262864 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.262872 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:56.262877 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:56.262943 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:56.287030 2064791 cri.go:92] found id: ""
	I1219 06:14:56.287043 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.287050 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:56.287056 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:56.287118 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:56.312417 2064791 cri.go:92] found id: ""
	I1219 06:14:56.312430 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.312437 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:56.312442 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:56.312505 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:56.350599 2064791 cri.go:92] found id: ""
	I1219 06:14:56.350613 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.350622 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:56.350627 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:56.350686 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:56.374515 2064791 cri.go:92] found id: ""
	I1219 06:14:56.374528 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.374535 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:56.374540 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:56.374596 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:56.399267 2064791 cri.go:92] found id: ""
	I1219 06:14:56.399281 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.399288 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:56.399293 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:56.399351 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:56.424503 2064791 cri.go:92] found id: ""
	I1219 06:14:56.424516 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.424523 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:56.424531 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:56.424541 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:56.490954 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:56.490973 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:56.522329 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:56.522345 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:56.582279 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:56.582298 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:56.599656 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:56.599673 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:56.665092 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:56.656833   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.657522   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659110   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659433   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.661032   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:56.656833   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.657522   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659110   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659433   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.661032   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:59.165361 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:59.178705 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:59.178767 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:59.207415 2064791 cri.go:92] found id: ""
	I1219 06:14:59.207429 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.207436 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:59.207441 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:59.207499 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:59.231912 2064791 cri.go:92] found id: ""
	I1219 06:14:59.231926 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.231934 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:59.231939 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:59.232000 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:59.258822 2064791 cri.go:92] found id: ""
	I1219 06:14:59.258836 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.258843 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:59.258848 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:59.258909 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:59.283942 2064791 cri.go:92] found id: ""
	I1219 06:14:59.283955 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.283963 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:59.283968 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:59.284026 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:59.311236 2064791 cri.go:92] found id: ""
	I1219 06:14:59.311249 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.311256 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:59.311262 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:59.311322 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:59.336239 2064791 cri.go:92] found id: ""
	I1219 06:14:59.336253 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.336260 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:59.336267 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:59.336325 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:59.360395 2064791 cri.go:92] found id: ""
	I1219 06:14:59.360409 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.360417 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:59.360425 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:59.360435 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:59.423580 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:59.423601 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:59.453489 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:59.453506 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:59.512842 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:59.512862 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:59.530149 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:59.530168 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:59.593869 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:59.584731   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.585448   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587088   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587675   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.589312   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:59.584731   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.585448   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587088   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587675   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.589312   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:02.094126 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:02.104778 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:02.104839 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:02.129446 2064791 cri.go:92] found id: ""
	I1219 06:15:02.129462 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.129469 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:02.129474 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:02.129539 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:02.154875 2064791 cri.go:92] found id: ""
	I1219 06:15:02.154889 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.154896 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:02.154901 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:02.155006 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:02.180628 2064791 cri.go:92] found id: ""
	I1219 06:15:02.180643 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.180650 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:02.180655 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:02.180716 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:02.205447 2064791 cri.go:92] found id: ""
	I1219 06:15:02.205462 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.205469 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:02.205475 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:02.205543 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:02.233523 2064791 cri.go:92] found id: ""
	I1219 06:15:02.233537 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.233544 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:02.233550 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:02.233610 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:02.259723 2064791 cri.go:92] found id: ""
	I1219 06:15:02.259738 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.259744 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:02.259750 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:02.259813 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:02.289093 2064791 cri.go:92] found id: ""
	I1219 06:15:02.289108 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.289115 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:02.289123 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:02.289133 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:02.347737 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:02.347758 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:02.365547 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:02.365564 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:02.433606 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:02.424090   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425124   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425822   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.427646   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.428231   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:02.424090   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425124   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425822   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.427646   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.428231   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:02.433616 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:02.433627 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:02.497677 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:02.497697 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:05.027685 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:05.037775 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:05.037845 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:05.062132 2064791 cri.go:92] found id: ""
	I1219 06:15:05.062146 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.062152 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:05.062157 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:05.062230 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:05.087233 2064791 cri.go:92] found id: ""
	I1219 06:15:05.087247 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.087254 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:05.087259 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:05.087318 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:05.116140 2064791 cri.go:92] found id: ""
	I1219 06:15:05.116155 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.116162 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:05.116167 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:05.116229 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:05.141158 2064791 cri.go:92] found id: ""
	I1219 06:15:05.141171 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.141179 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:05.141184 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:05.141255 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:05.166033 2064791 cri.go:92] found id: ""
	I1219 06:15:05.166046 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.166053 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:05.166059 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:05.166118 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:05.189930 2064791 cri.go:92] found id: ""
	I1219 06:15:05.189943 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.189951 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:05.189956 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:05.190013 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:05.217697 2064791 cri.go:92] found id: ""
	I1219 06:15:05.217711 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.217718 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:05.217726 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:05.217737 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:05.273609 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:05.273629 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:05.291274 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:05.291291 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:05.355137 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:05.346587   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.347461   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349261   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349738   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.351338   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:05.346587   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.347461   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349261   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349738   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.351338   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:05.355147 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:05.355158 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:05.418376 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:05.418395 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:07.946932 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:07.957404 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:07.957465 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:07.983257 2064791 cri.go:92] found id: ""
	I1219 06:15:07.983270 2064791 logs.go:282] 0 containers: []
	W1219 06:15:07.983277 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:07.983283 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:07.983344 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:08.010747 2064791 cri.go:92] found id: ""
	I1219 06:15:08.010762 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.010770 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:08.010776 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:08.010842 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:08.040479 2064791 cri.go:92] found id: ""
	I1219 06:15:08.040493 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.040500 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:08.040506 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:08.040566 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:08.067147 2064791 cri.go:92] found id: ""
	I1219 06:15:08.067162 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.067169 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:08.067175 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:08.067238 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:08.096399 2064791 cri.go:92] found id: ""
	I1219 06:15:08.096415 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.096422 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:08.096430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:08.096492 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:08.120924 2064791 cri.go:92] found id: ""
	I1219 06:15:08.120938 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.120945 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:08.120951 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:08.121010 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:08.145044 2064791 cri.go:92] found id: ""
	I1219 06:15:08.145057 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.145064 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:08.145072 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:08.145082 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:08.201643 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:08.201664 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:08.219150 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:08.219166 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:08.285100 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:08.276907   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.277509   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279021   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279543   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.281080   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:08.276907   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.277509   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279021   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279543   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.281080   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:08.285118 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:08.285129 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:08.349440 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:08.349460 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:10.878798 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:10.888854 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:10.888917 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:10.920436 2064791 cri.go:92] found id: ""
	I1219 06:15:10.920450 2064791 logs.go:282] 0 containers: []
	W1219 06:15:10.920457 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:10.920463 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:10.920536 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:10.951229 2064791 cri.go:92] found id: ""
	I1219 06:15:10.951243 2064791 logs.go:282] 0 containers: []
	W1219 06:15:10.951252 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:10.951258 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:10.951315 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:10.980039 2064791 cri.go:92] found id: ""
	I1219 06:15:10.980054 2064791 logs.go:282] 0 containers: []
	W1219 06:15:10.980061 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:10.980066 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:10.980126 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:11.008250 2064791 cri.go:92] found id: ""
	I1219 06:15:11.008265 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.008273 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:11.008278 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:11.008346 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:11.033554 2064791 cri.go:92] found id: ""
	I1219 06:15:11.033568 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.033575 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:11.033580 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:11.033641 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:11.058115 2064791 cri.go:92] found id: ""
	I1219 06:15:11.058128 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.058135 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:11.058141 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:11.058219 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:11.083222 2064791 cri.go:92] found id: ""
	I1219 06:15:11.083236 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.083242 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:11.083250 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:11.083260 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:11.146681 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:11.146702 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:11.176028 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:11.176047 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:11.233340 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:11.233361 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:11.250941 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:11.250957 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:11.315829 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:11.306797   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.307411   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309263   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309846   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.311388   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:11.306797   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.307411   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309263   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309846   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.311388   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:13.816114 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:13.826460 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:13.826527 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:13.850958 2064791 cri.go:92] found id: ""
	I1219 06:15:13.850973 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.850980 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:13.850988 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:13.851048 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:13.879518 2064791 cri.go:92] found id: ""
	I1219 06:15:13.879538 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.879546 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:13.879551 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:13.879611 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:13.917876 2064791 cri.go:92] found id: ""
	I1219 06:15:13.917890 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.917897 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:13.917902 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:13.917965 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:13.957039 2064791 cri.go:92] found id: ""
	I1219 06:15:13.957053 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.957060 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:13.957065 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:13.957126 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:13.992398 2064791 cri.go:92] found id: ""
	I1219 06:15:13.992412 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.992419 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:13.992424 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:13.992486 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:14.019915 2064791 cri.go:92] found id: ""
	I1219 06:15:14.019930 2064791 logs.go:282] 0 containers: []
	W1219 06:15:14.019938 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:14.019943 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:14.020004 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:14.045800 2064791 cri.go:92] found id: ""
	I1219 06:15:14.045815 2064791 logs.go:282] 0 containers: []
	W1219 06:15:14.045822 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:14.045830 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:14.045841 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:14.102453 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:14.102472 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:14.120093 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:14.120110 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:14.183187 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:14.175289   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.175797   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177338   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177777   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.179247   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:14.175289   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.175797   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177338   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177777   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.179247   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:14.183198 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:14.183209 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:14.246652 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:14.246673 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:16.780257 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:16.790741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:16.790802 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:16.815777 2064791 cri.go:92] found id: ""
	I1219 06:15:16.815802 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.815809 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:16.815815 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:16.815890 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:16.841105 2064791 cri.go:92] found id: ""
	I1219 06:15:16.841124 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.841142 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:16.841148 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:16.841217 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:16.866795 2064791 cri.go:92] found id: ""
	I1219 06:15:16.866820 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.866827 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:16.866833 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:16.866910 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:16.892692 2064791 cri.go:92] found id: ""
	I1219 06:15:16.892706 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.892713 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:16.892718 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:16.892803 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:16.926258 2064791 cri.go:92] found id: ""
	I1219 06:15:16.926272 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.926279 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:16.926285 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:16.926346 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:16.955968 2064791 cri.go:92] found id: ""
	I1219 06:15:16.955982 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.955989 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:16.955995 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:16.956057 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:16.985158 2064791 cri.go:92] found id: ""
	I1219 06:15:16.985172 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.985179 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:16.985186 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:16.985196 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:17.043879 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:17.043899 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:17.061599 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:17.061616 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:17.125509 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:17.117153   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.117733   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119480   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119896   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.121399   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:17.117153   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.117733   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119480   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119896   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.121399   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:17.125519 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:17.125531 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:17.189339 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:17.189359 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:19.721517 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:19.731846 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:19.731916 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:19.758133 2064791 cri.go:92] found id: ""
	I1219 06:15:19.758147 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.758154 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:19.758160 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:19.758228 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:19.787023 2064791 cri.go:92] found id: ""
	I1219 06:15:19.787037 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.787045 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:19.787059 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:19.787123 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:19.813855 2064791 cri.go:92] found id: ""
	I1219 06:15:19.813869 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.813876 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:19.813881 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:19.813944 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:19.838418 2064791 cri.go:92] found id: ""
	I1219 06:15:19.838432 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.838439 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:19.838444 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:19.838508 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:19.863215 2064791 cri.go:92] found id: ""
	I1219 06:15:19.863229 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.863240 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:19.863246 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:19.863307 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:19.887732 2064791 cri.go:92] found id: ""
	I1219 06:15:19.887746 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.887753 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:19.887758 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:19.887815 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:19.930174 2064791 cri.go:92] found id: ""
	I1219 06:15:19.930192 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.930200 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:19.930208 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:19.930222 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:19.949025 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:19.949041 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:20.022932 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:20.013526   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.014350   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016121   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016702   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.018424   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:20.013526   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.014350   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016121   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016702   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.018424   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:20.022944 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:20.022955 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:20.088903 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:20.088924 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:20.117778 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:20.117794 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:22.677536 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:22.687468 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:22.687536 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:22.712714 2064791 cri.go:92] found id: ""
	I1219 06:15:22.712728 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.712736 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:22.712741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:22.712816 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:22.736316 2064791 cri.go:92] found id: ""
	I1219 06:15:22.736329 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.736336 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:22.736341 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:22.736401 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:22.762215 2064791 cri.go:92] found id: ""
	I1219 06:15:22.762229 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.762236 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:22.762241 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:22.762309 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:22.787061 2064791 cri.go:92] found id: ""
	I1219 06:15:22.787074 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.787081 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:22.787086 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:22.787146 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:22.814937 2064791 cri.go:92] found id: ""
	I1219 06:15:22.814951 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.814957 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:22.814963 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:22.815033 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:22.842839 2064791 cri.go:92] found id: ""
	I1219 06:15:22.842853 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.842859 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:22.842865 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:22.842923 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:22.869394 2064791 cri.go:92] found id: ""
	I1219 06:15:22.869407 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.869413 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:22.869421 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:22.869430 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:22.926492 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:22.926510 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:22.944210 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:22.944232 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:23.013797 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:23.003493   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.005070   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.006087   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.007846   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.008447   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:23.003493   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.005070   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.006087   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.007846   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.008447   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:23.013807 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:23.013821 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:23.081279 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:23.081306 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:25.612946 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:25.622887 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:25.622947 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:25.656332 2064791 cri.go:92] found id: ""
	I1219 06:15:25.656346 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.656353 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:25.656359 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:25.656425 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:25.680887 2064791 cri.go:92] found id: ""
	I1219 06:15:25.680901 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.680908 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:25.680913 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:25.680981 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:25.705508 2064791 cri.go:92] found id: ""
	I1219 06:15:25.705523 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.705531 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:25.705536 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:25.705598 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:25.729434 2064791 cri.go:92] found id: ""
	I1219 06:15:25.729447 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.729454 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:25.729459 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:25.729517 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:25.755351 2064791 cri.go:92] found id: ""
	I1219 06:15:25.755365 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.755381 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:25.755388 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:25.755449 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:25.782840 2064791 cri.go:92] found id: ""
	I1219 06:15:25.782854 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.782861 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:25.782866 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:25.782929 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:25.811125 2064791 cri.go:92] found id: ""
	I1219 06:15:25.811139 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.811155 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:25.811165 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:25.811175 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:25.867579 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:25.867601 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:25.884977 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:25.884996 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:25.983099 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:25.974919   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.975374   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.977076   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.977555   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.979165   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:15:25.983110 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:25.983119 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:26.047515 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:26.047534 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:28.576468 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:28.586983 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:28.587044 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:28.612243 2064791 cri.go:92] found id: ""
	I1219 06:15:28.612257 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.612264 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:28.612270 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:28.612331 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:28.637476 2064791 cri.go:92] found id: ""
	I1219 06:15:28.637490 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.637496 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:28.637502 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:28.637564 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:28.662778 2064791 cri.go:92] found id: ""
	I1219 06:15:28.662792 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.662800 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:28.662805 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:28.662864 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:28.687078 2064791 cri.go:92] found id: ""
	I1219 06:15:28.687091 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.687098 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:28.687105 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:28.687166 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:28.712552 2064791 cri.go:92] found id: ""
	I1219 06:15:28.712566 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.712572 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:28.712577 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:28.712646 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:28.738798 2064791 cri.go:92] found id: ""
	I1219 06:15:28.738812 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.738819 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:28.738824 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:28.738881 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:28.767309 2064791 cri.go:92] found id: ""
	I1219 06:15:28.767324 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.767340 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:28.767349 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:28.767358 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:28.827489 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:28.827509 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:28.844978 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:28.844994 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:28.915425 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:28.906948   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.907778   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.909411   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.909881   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.911514   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:15:28.915435 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:28.915445 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:28.980721 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:28.980742 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:31.518692 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:31.528660 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:31.528719 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:31.551685 2064791 cri.go:92] found id: ""
	I1219 06:15:31.551699 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.551706 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:31.551711 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:31.551772 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:31.578616 2064791 cri.go:92] found id: ""
	I1219 06:15:31.578631 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.578637 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:31.578643 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:31.578703 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:31.602562 2064791 cri.go:92] found id: ""
	I1219 06:15:31.602576 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.602582 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:31.602588 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:31.602646 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:31.626697 2064791 cri.go:92] found id: ""
	I1219 06:15:31.626711 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.626718 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:31.626723 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:31.626786 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:31.650705 2064791 cri.go:92] found id: ""
	I1219 06:15:31.650718 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.650725 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:31.650730 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:31.650791 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:31.675292 2064791 cri.go:92] found id: ""
	I1219 06:15:31.675305 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.675312 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:31.675318 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:31.675390 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:31.699969 2064791 cri.go:92] found id: ""
	I1219 06:15:31.699993 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.700000 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:31.700008 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:31.700018 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:31.765728 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:31.765750 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:31.793450 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:31.793466 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:31.849244 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:31.849262 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:31.866467 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:31.866483 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:31.960156 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:31.933314   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.948921   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.949480   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.951697   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.951968   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:15:34.460923 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:34.473072 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:34.473134 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:34.498075 2064791 cri.go:92] found id: ""
	I1219 06:15:34.498089 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.498097 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:34.498103 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:34.498162 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:34.522785 2064791 cri.go:92] found id: ""
	I1219 06:15:34.522800 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.522807 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:34.522812 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:34.522871 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:34.550566 2064791 cri.go:92] found id: ""
	I1219 06:15:34.550580 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.550587 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:34.550592 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:34.550651 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:34.579586 2064791 cri.go:92] found id: ""
	I1219 06:15:34.579600 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.579607 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:34.579612 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:34.579670 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:34.606248 2064791 cri.go:92] found id: ""
	I1219 06:15:34.606261 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.606269 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:34.606274 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:34.606335 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:34.634419 2064791 cri.go:92] found id: ""
	I1219 06:15:34.634433 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.634440 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:34.634446 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:34.634509 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:34.658438 2064791 cri.go:92] found id: ""
	I1219 06:15:34.658451 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.658458 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:34.658465 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:34.658475 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:34.675933 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:34.675950 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:34.740273 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:34.732883   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.733297   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.734737   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.735051   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.736480   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:15:34.740283 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:34.740293 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:34.802357 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:34.802378 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:34.833735 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:34.833751 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:37.390170 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:37.400300 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:37.400358 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:37.425095 2064791 cri.go:92] found id: ""
	I1219 06:15:37.425110 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.425117 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:37.425122 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:37.425178 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:37.451178 2064791 cri.go:92] found id: ""
	I1219 06:15:37.451192 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.451199 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:37.451205 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:37.451273 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:37.475828 2064791 cri.go:92] found id: ""
	I1219 06:15:37.475842 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.475848 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:37.475854 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:37.475911 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:37.499474 2064791 cri.go:92] found id: ""
	I1219 06:15:37.499488 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.499494 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:37.499500 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:37.499563 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:37.523636 2064791 cri.go:92] found id: ""
	I1219 06:15:37.523649 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.523656 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:37.523662 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:37.523720 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:37.547846 2064791 cri.go:92] found id: ""
	I1219 06:15:37.547859 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.547868 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:37.547873 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:37.547929 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:37.574766 2064791 cri.go:92] found id: ""
	I1219 06:15:37.574780 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.574787 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:37.574795 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:37.574805 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:37.601905 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:37.601923 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:37.657564 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:37.657584 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:37.674777 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:37.674793 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:37.736918 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:37.728853   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.729594   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731160   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731519   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.733094   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:15:37.736928 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:37.736939 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:40.303769 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:40.313854 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:40.313919 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:40.338505 2064791 cri.go:92] found id: ""
	I1219 06:15:40.338519 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.338527 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:40.338532 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:40.338594 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:40.363391 2064791 cri.go:92] found id: ""
	I1219 06:15:40.363405 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.363412 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:40.363417 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:40.363476 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:40.389092 2064791 cri.go:92] found id: ""
	I1219 06:15:40.389105 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.389113 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:40.389118 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:40.389184 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:40.412993 2064791 cri.go:92] found id: ""
	I1219 06:15:40.413007 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.413014 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:40.413022 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:40.413087 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:40.438530 2064791 cri.go:92] found id: ""
	I1219 06:15:40.438544 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.438550 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:40.438556 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:40.438617 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:40.462221 2064791 cri.go:92] found id: ""
	I1219 06:15:40.462235 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.462242 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:40.462248 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:40.462310 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:40.487125 2064791 cri.go:92] found id: ""
	I1219 06:15:40.487139 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.487146 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:40.487155 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:40.487165 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:40.543163 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:40.543184 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:40.560362 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:40.560379 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:40.627130 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:40.619309   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.619937   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621423   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621900   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.623348   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:40.619309   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.619937   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621423   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621900   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.623348   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:40.627139 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:40.627149 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:40.689654 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:40.689673 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:43.219338 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:43.229544 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:43.229607 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:43.253914 2064791 cri.go:92] found id: ""
	I1219 06:15:43.253935 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.253941 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:43.253947 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:43.254007 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:43.279019 2064791 cri.go:92] found id: ""
	I1219 06:15:43.279033 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.279040 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:43.279045 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:43.279106 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:43.304187 2064791 cri.go:92] found id: ""
	I1219 06:15:43.304202 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.304209 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:43.304216 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:43.304275 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:43.327938 2064791 cri.go:92] found id: ""
	I1219 06:15:43.327951 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.327958 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:43.327963 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:43.328027 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:43.356864 2064791 cri.go:92] found id: ""
	I1219 06:15:43.356878 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.356885 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:43.356891 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:43.356958 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:43.381050 2064791 cri.go:92] found id: ""
	I1219 06:15:43.381063 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.381070 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:43.381076 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:43.381138 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:43.404804 2064791 cri.go:92] found id: ""
	I1219 06:15:43.404818 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.404825 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:43.404832 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:43.404857 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:43.470026 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:43.461361   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.461922   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.463417   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.464514   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.465204   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:43.461361   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.461922   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.463417   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.464514   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.465204   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:43.470036 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:43.470050 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:43.533067 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:43.533086 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:43.560074 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:43.560097 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:43.618564 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:43.618582 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:46.135866 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:46.146429 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:46.146493 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:46.180564 2064791 cri.go:92] found id: ""
	I1219 06:15:46.180578 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.180595 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:46.180601 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:46.180669 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:46.208067 2064791 cri.go:92] found id: ""
	I1219 06:15:46.208081 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.208087 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:46.208100 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:46.208159 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:46.234676 2064791 cri.go:92] found id: ""
	I1219 06:15:46.234692 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.234703 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:46.234709 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:46.234775 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:46.259673 2064791 cri.go:92] found id: ""
	I1219 06:15:46.259686 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.259693 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:46.259707 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:46.259765 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:46.286964 2064791 cri.go:92] found id: ""
	I1219 06:15:46.286979 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.286986 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:46.286992 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:46.287056 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:46.312785 2064791 cri.go:92] found id: ""
	I1219 06:15:46.312800 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.312807 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:46.312813 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:46.312875 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:46.339250 2064791 cri.go:92] found id: ""
	I1219 06:15:46.339264 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.339271 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:46.339279 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:46.339290 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:46.368113 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:46.368129 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:46.423008 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:46.423029 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:46.440481 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:46.440503 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:46.504270 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:46.496670   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.497191   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.498736   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.499181   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.500593   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:46.496670   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.497191   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.498736   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.499181   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.500593   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:46.504280 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:46.504291 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:49.065736 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:49.075993 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:49.076057 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:49.102714 2064791 cri.go:92] found id: ""
	I1219 06:15:49.102729 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.102736 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:49.102741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:49.102808 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:49.131284 2064791 cri.go:92] found id: ""
	I1219 06:15:49.131297 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.131323 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:49.131328 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:49.131398 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:49.166942 2064791 cri.go:92] found id: ""
	I1219 06:15:49.166955 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.166962 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:49.166968 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:49.167036 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:49.204412 2064791 cri.go:92] found id: ""
	I1219 06:15:49.204425 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.204444 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:49.204450 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:49.204522 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:49.232351 2064791 cri.go:92] found id: ""
	I1219 06:15:49.232364 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.232371 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:49.232377 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:49.232434 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:49.257013 2064791 cri.go:92] found id: ""
	I1219 06:15:49.257028 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.257046 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:49.257052 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:49.257112 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:49.282354 2064791 cri.go:92] found id: ""
	I1219 06:15:49.282368 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.282375 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:49.282384 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:49.282396 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:49.351742 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:49.342325   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.343272   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345051   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345596   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.347231   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:49.342325   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.343272   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345051   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345596   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.347231   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:49.351753 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:49.351764 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:49.416971 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:49.416991 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:49.445804 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:49.445819 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:49.503988 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:49.504006 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:52.023309 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:52.034750 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:52.034819 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:52.061000 2064791 cri.go:92] found id: ""
	I1219 06:15:52.061014 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.061021 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:52.061026 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:52.061084 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:52.086949 2064791 cri.go:92] found id: ""
	I1219 06:15:52.086964 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.086971 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:52.086977 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:52.087048 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:52.112534 2064791 cri.go:92] found id: ""
	I1219 06:15:52.112549 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.112556 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:52.112562 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:52.112635 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:52.137132 2064791 cri.go:92] found id: ""
	I1219 06:15:52.137146 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.137154 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:52.137160 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:52.137221 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:52.191157 2064791 cri.go:92] found id: ""
	I1219 06:15:52.191171 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.191178 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:52.191184 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:52.191245 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:52.220921 2064791 cri.go:92] found id: ""
	I1219 06:15:52.220936 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.220942 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:52.220948 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:52.221009 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:52.250645 2064791 cri.go:92] found id: ""
	I1219 06:15:52.250658 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.250665 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:52.250673 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:52.250684 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:52.306199 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:52.306222 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:52.323553 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:52.323570 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:52.386634 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:52.378311   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.379023   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.380829   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.381330   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.382864   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:52.378311   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.379023   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.380829   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.381330   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.382864   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:52.386643 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:52.386653 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:52.450135 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:52.450155 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:54.981347 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:54.991806 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:54.991864 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:55.028687 2064791 cri.go:92] found id: ""
	I1219 06:15:55.028702 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.028709 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:55.028714 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:55.028797 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:55.053716 2064791 cri.go:92] found id: ""
	I1219 06:15:55.053730 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.053737 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:55.053784 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:55.053857 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:55.080935 2064791 cri.go:92] found id: ""
	I1219 06:15:55.080949 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.080957 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:55.080962 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:55.081027 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:55.109910 2064791 cri.go:92] found id: ""
	I1219 06:15:55.109925 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.109932 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:55.109938 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:55.110005 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:55.138372 2064791 cri.go:92] found id: ""
	I1219 06:15:55.138386 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.138393 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:55.138400 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:55.138463 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:55.172107 2064791 cri.go:92] found id: ""
	I1219 06:15:55.172121 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.172128 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:55.172133 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:55.172191 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:55.207670 2064791 cri.go:92] found id: ""
	I1219 06:15:55.207684 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.207690 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:55.207698 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:55.207708 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:55.273955 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:55.273975 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:55.303942 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:55.303960 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:55.367492 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:55.367517 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:55.384909 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:55.384933 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:55.447954 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:55.439722   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.440407   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442003   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442525   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.444025   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:55.439722   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.440407   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442003   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442525   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.444025   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:57.948746 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:57.959024 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:57.959084 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:57.984251 2064791 cri.go:92] found id: ""
	I1219 06:15:57.984264 2064791 logs.go:282] 0 containers: []
	W1219 06:15:57.984271 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:57.984277 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:57.984335 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:58.012444 2064791 cri.go:92] found id: ""
	I1219 06:15:58.012459 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.012467 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:58.012472 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:58.012531 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:58.040674 2064791 cri.go:92] found id: ""
	I1219 06:15:58.040688 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.040695 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:58.040700 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:58.040783 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:58.066507 2064791 cri.go:92] found id: ""
	I1219 06:15:58.066522 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.066529 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:58.066535 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:58.066598 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:58.095594 2064791 cri.go:92] found id: ""
	I1219 06:15:58.095608 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.095615 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:58.095620 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:58.095680 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:58.121624 2064791 cri.go:92] found id: ""
	I1219 06:15:58.121638 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.121644 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:58.121650 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:58.121707 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:58.149741 2064791 cri.go:92] found id: ""
	I1219 06:15:58.149755 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.149762 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:58.149770 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:58.149782 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:58.181272 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:58.181288 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:58.240957 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:58.240987 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:58.258044 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:58.258060 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:58.322228 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:58.314484   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.315163   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.316700   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.317166   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.318619   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:58.314484   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.315163   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.316700   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.317166   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.318619   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:58.322239 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:58.322250 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:00.885057 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:00.895320 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:00.895386 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:00.919880 2064791 cri.go:92] found id: ""
	I1219 06:16:00.919914 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.919922 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:00.919927 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:00.919995 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:00.944225 2064791 cri.go:92] found id: ""
	I1219 06:16:00.944238 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.944245 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:00.944250 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:00.944316 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:00.969895 2064791 cri.go:92] found id: ""
	I1219 06:16:00.969909 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.969916 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:00.969921 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:00.969982 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:00.994103 2064791 cri.go:92] found id: ""
	I1219 06:16:00.994118 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.994134 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:00.994141 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:00.994224 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:01.021151 2064791 cri.go:92] found id: ""
	I1219 06:16:01.021166 2064791 logs.go:282] 0 containers: []
	W1219 06:16:01.021172 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:01.021181 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:01.021244 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:01.046747 2064791 cri.go:92] found id: ""
	I1219 06:16:01.046761 2064791 logs.go:282] 0 containers: []
	W1219 06:16:01.046768 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:01.046773 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:01.046831 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:01.071655 2064791 cri.go:92] found id: ""
	I1219 06:16:01.071672 2064791 logs.go:282] 0 containers: []
	W1219 06:16:01.071679 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:01.071686 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:01.071696 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:01.127618 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:01.127636 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:01.145631 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:01.145650 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:01.235681 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:01.226719   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.227442   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229160   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229807   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.231513   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:01.226719   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.227442   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229160   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229807   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.231513   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:01.235691 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:01.235703 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:01.299234 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:01.299254 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:03.829050 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:03.839364 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:03.839436 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:03.871023 2064791 cri.go:92] found id: ""
	I1219 06:16:03.871037 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.871044 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:03.871049 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:03.871107 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:03.895774 2064791 cri.go:92] found id: ""
	I1219 06:16:03.895788 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.895795 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:03.895800 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:03.895859 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:03.921890 2064791 cri.go:92] found id: ""
	I1219 06:16:03.921904 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.921911 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:03.921916 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:03.921978 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:03.946705 2064791 cri.go:92] found id: ""
	I1219 06:16:03.946719 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.946726 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:03.946731 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:03.946790 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:03.972566 2064791 cri.go:92] found id: ""
	I1219 06:16:03.972579 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.972605 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:03.972610 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:03.972676 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:03.998217 2064791 cri.go:92] found id: ""
	I1219 06:16:03.998232 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.998239 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:03.998245 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:03.998311 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:04.024748 2064791 cri.go:92] found id: ""
	I1219 06:16:04.024786 2064791 logs.go:282] 0 containers: []
	W1219 06:16:04.024793 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:04.024802 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:04.024827 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:04.089385 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:04.089406 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:04.120677 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:04.120695 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:04.178263 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:04.178282 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:04.201672 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:04.201688 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:04.272543 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:04.263798   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.264930   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.265587   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267210   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267480   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:04.263798   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.264930   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.265587   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267210   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267480   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:06.772819 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:06.784042 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:06.784119 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:06.809087 2064791 cri.go:92] found id: ""
	I1219 06:16:06.809101 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.809108 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:06.809113 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:06.809171 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:06.833636 2064791 cri.go:92] found id: ""
	I1219 06:16:06.833649 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.833656 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:06.833661 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:06.833726 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:06.862766 2064791 cri.go:92] found id: ""
	I1219 06:16:06.862781 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.862788 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:06.862797 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:06.862858 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:06.887915 2064791 cri.go:92] found id: ""
	I1219 06:16:06.887929 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.887935 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:06.887940 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:06.888001 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:06.913093 2064791 cri.go:92] found id: ""
	I1219 06:16:06.913107 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.913114 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:06.913119 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:06.913184 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:06.944662 2064791 cri.go:92] found id: ""
	I1219 06:16:06.944677 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.944695 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:06.944700 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:06.944796 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:06.976908 2064791 cri.go:92] found id: ""
	I1219 06:16:06.976923 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.976929 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:06.976937 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:06.976948 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:07.041844 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:07.041865 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:07.071749 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:07.071765 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:07.130039 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:07.130060 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:07.147749 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:07.147766 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:07.226540 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:07.218267   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.218857   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220373   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220937   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.222460   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:07.218267   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.218857   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220373   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220937   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.222460   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:09.726802 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:09.737347 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:09.737408 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:09.761740 2064791 cri.go:92] found id: ""
	I1219 06:16:09.761754 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.761761 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:09.761767 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:09.761838 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:09.787861 2064791 cri.go:92] found id: ""
	I1219 06:16:09.787876 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.787883 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:09.787888 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:09.787950 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:09.812599 2064791 cri.go:92] found id: ""
	I1219 06:16:09.812613 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.812620 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:09.812625 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:09.812687 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:09.837573 2064791 cri.go:92] found id: ""
	I1219 06:16:09.837588 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.837596 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:09.837601 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:09.837661 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:09.861697 2064791 cri.go:92] found id: ""
	I1219 06:16:09.861712 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.861718 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:09.861723 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:09.861788 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:09.886842 2064791 cri.go:92] found id: ""
	I1219 06:16:09.886856 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.886872 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:09.886884 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:09.886956 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:09.912372 2064791 cri.go:92] found id: ""
	I1219 06:16:09.912387 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.912395 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:09.912403 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:09.912413 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:09.971481 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:09.971501 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:09.989303 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:09.989320 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:10.067493 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:10.058017   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059169   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059962   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.061845   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.062203   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:10.058017   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059169   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059962   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.061845   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.062203   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:10.067504 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:10.067517 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:10.132042 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:10.132062 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:12.664804 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:12.675466 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:12.675550 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:12.704963 2064791 cri.go:92] found id: ""
	I1219 06:16:12.704978 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.704985 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:12.704990 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:12.705052 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:12.730087 2064791 cri.go:92] found id: ""
	I1219 06:16:12.730103 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.730110 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:12.730115 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:12.730178 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:12.760566 2064791 cri.go:92] found id: ""
	I1219 06:16:12.760595 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.760602 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:12.760608 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:12.760675 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:12.785694 2064791 cri.go:92] found id: ""
	I1219 06:16:12.785707 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.785714 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:12.785719 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:12.785781 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:12.811923 2064791 cri.go:92] found id: ""
	I1219 06:16:12.811938 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.811956 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:12.811962 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:12.812036 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:12.838424 2064791 cri.go:92] found id: ""
	I1219 06:16:12.838438 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.838445 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:12.838451 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:12.838514 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:12.864177 2064791 cri.go:92] found id: ""
	I1219 06:16:12.864191 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.864198 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:12.864206 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:12.864216 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:12.920882 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:12.920904 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:12.937942 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:12.937959 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:13.004209 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:12.994302   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.994966   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.996691   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.997250   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.998931   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:12.994302   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.994966   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.996691   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.997250   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.998931   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:13.004223 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:13.004247 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:13.067051 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:13.067071 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:15.596451 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:15.606953 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:15.607013 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:15.639546 2064791 cri.go:92] found id: ""
	I1219 06:16:15.639560 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.639569 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:15.639574 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:15.639637 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:15.667230 2064791 cri.go:92] found id: ""
	I1219 06:16:15.667245 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.667252 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:15.667257 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:15.667321 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:15.693059 2064791 cri.go:92] found id: ""
	I1219 06:16:15.693073 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.693080 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:15.693086 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:15.693145 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:15.718341 2064791 cri.go:92] found id: ""
	I1219 06:16:15.718356 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.718363 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:15.718368 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:15.718437 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:15.744544 2064791 cri.go:92] found id: ""
	I1219 06:16:15.744559 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.744566 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:15.744571 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:15.744632 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:15.769809 2064791 cri.go:92] found id: ""
	I1219 06:16:15.769823 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.769830 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:15.769836 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:15.769897 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:15.793872 2064791 cri.go:92] found id: ""
	I1219 06:16:15.793887 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.793894 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:15.793902 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:15.793914 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:15.811209 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:15.811228 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:15.875495 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:15.867475   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.868031   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.869581   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.870044   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.871521   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:15.867475   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.868031   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.869581   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.870044   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.871521   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:15.875504 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:15.875516 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:15.938869 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:15.938889 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:15.967183 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:15.967200 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:18.524056 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:18.534213 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:18.534283 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:18.558904 2064791 cri.go:92] found id: ""
	I1219 06:16:18.558918 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.558924 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:18.558929 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:18.558994 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:18.583638 2064791 cri.go:92] found id: ""
	I1219 06:16:18.583653 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.583661 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:18.583666 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:18.583726 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:18.611047 2064791 cri.go:92] found id: ""
	I1219 06:16:18.611061 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.611068 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:18.611073 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:18.611133 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:18.635234 2064791 cri.go:92] found id: ""
	I1219 06:16:18.635248 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.635255 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:18.635261 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:18.635322 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:18.658732 2064791 cri.go:92] found id: ""
	I1219 06:16:18.658747 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.658754 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:18.658759 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:18.658819 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:18.687782 2064791 cri.go:92] found id: ""
	I1219 06:16:18.687796 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.687803 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:18.687808 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:18.687871 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:18.713641 2064791 cri.go:92] found id: ""
	I1219 06:16:18.713655 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.713662 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:18.713670 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:18.713687 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:18.730768 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:18.730786 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:18.797385 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:18.788999   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.789629   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791299   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791871   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.793492   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:18.788999   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.789629   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791299   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791871   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.793492   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:18.797396 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:18.797406 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:18.861009 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:18.861029 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:18.889085 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:18.889102 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:21.448880 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:21.458996 2064791 kubeadm.go:602] duration metric: took 4m4.592886052s to restartPrimaryControlPlane
	W1219 06:16:21.459078 2064791 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1219 06:16:21.459152 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1219 06:16:21.873036 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:16:21.887075 2064791 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1219 06:16:21.894868 2064791 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1219 06:16:21.894925 2064791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:16:21.902909 2064791 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1219 06:16:21.902919 2064791 kubeadm.go:158] found existing configuration files:
	
	I1219 06:16:21.902973 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 06:16:21.912282 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1219 06:16:21.912342 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1219 06:16:21.920310 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 06:16:21.928090 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1219 06:16:21.928158 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:16:21.935829 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 06:16:21.944085 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1219 06:16:21.944143 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:16:21.951866 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 06:16:21.959883 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1219 06:16:21.959950 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:16:21.967628 2064791 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1219 06:16:22.006002 2064791 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1219 06:16:22.006076 2064791 kubeadm.go:319] [preflight] Running pre-flight checks
	I1219 06:16:22.084826 2064791 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1219 06:16:22.084890 2064791 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1219 06:16:22.084925 2064791 kubeadm.go:319] OS: Linux
	I1219 06:16:22.084969 2064791 kubeadm.go:319] CGROUPS_CPU: enabled
	I1219 06:16:22.085017 2064791 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1219 06:16:22.085068 2064791 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1219 06:16:22.085115 2064791 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1219 06:16:22.085163 2064791 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1219 06:16:22.085209 2064791 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1219 06:16:22.085254 2064791 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1219 06:16:22.085302 2064791 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1219 06:16:22.085348 2064791 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1219 06:16:22.154531 2064791 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1219 06:16:22.154670 2064791 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1219 06:16:22.154781 2064791 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1219 06:16:22.163477 2064791 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1219 06:16:22.169007 2064791 out.go:252]   - Generating certificates and keys ...
	I1219 06:16:22.169099 2064791 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1219 06:16:22.169162 2064791 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1219 06:16:22.169237 2064791 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1219 06:16:22.169297 2064791 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1219 06:16:22.169372 2064791 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1219 06:16:22.169426 2064791 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1219 06:16:22.169488 2064791 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1219 06:16:22.169549 2064791 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1219 06:16:22.169633 2064791 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1219 06:16:22.169704 2064791 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1219 06:16:22.169741 2064791 kubeadm.go:319] [certs] Using the existing "sa" key
	I1219 06:16:22.169795 2064791 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1219 06:16:22.320644 2064791 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1219 06:16:22.743805 2064791 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1219 06:16:22.867878 2064791 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1219 06:16:22.974729 2064791 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1219 06:16:23.395365 2064791 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1219 06:16:23.396030 2064791 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1219 06:16:23.399355 2064791 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1219 06:16:23.402464 2064791 out.go:252]   - Booting up control plane ...
	I1219 06:16:23.402561 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1219 06:16:23.402637 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1219 06:16:23.403521 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1219 06:16:23.423590 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1219 06:16:23.423990 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1219 06:16:23.431661 2064791 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1219 06:16:23.431897 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1219 06:16:23.432074 2064791 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1219 06:16:23.567443 2064791 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1219 06:16:23.567557 2064791 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1219 06:20:23.567966 2064791 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000561406s
	I1219 06:20:23.567991 2064791 kubeadm.go:319] 
	I1219 06:20:23.568084 2064791 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1219 06:20:23.568128 2064791 kubeadm.go:319] 	- The kubelet is not running
	I1219 06:20:23.568239 2064791 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1219 06:20:23.568244 2064791 kubeadm.go:319] 
	I1219 06:20:23.568354 2064791 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1219 06:20:23.568390 2064791 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1219 06:20:23.568420 2064791 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1219 06:20:23.568423 2064791 kubeadm.go:319] 
	I1219 06:20:23.572732 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 06:20:23.573205 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1219 06:20:23.573348 2064791 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 06:20:23.573651 2064791 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1219 06:20:23.573656 2064791 kubeadm.go:319] 
	W1219 06:20:23.573846 2064791 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000561406s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1219 06:20:23.573948 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1219 06:20:23.574218 2064791 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1219 06:20:23.984042 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:20:23.997740 2064791 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1219 06:20:23.997798 2064791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:20:24.008638 2064791 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1219 06:20:24.008649 2064791 kubeadm.go:158] found existing configuration files:
	
	I1219 06:20:24.008724 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 06:20:24.018051 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1219 06:20:24.018112 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1219 06:20:24.026089 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 06:20:24.034468 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1219 06:20:24.034524 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:20:24.042330 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 06:20:24.050325 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1219 06:20:24.050390 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:20:24.058263 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 06:20:24.066872 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1219 06:20:24.066933 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:20:24.075206 2064791 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1219 06:20:24.113532 2064791 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1219 06:20:24.113595 2064791 kubeadm.go:319] [preflight] Running pre-flight checks
	I1219 06:20:24.190273 2064791 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1219 06:20:24.190347 2064791 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1219 06:20:24.190399 2064791 kubeadm.go:319] OS: Linux
	I1219 06:20:24.190447 2064791 kubeadm.go:319] CGROUPS_CPU: enabled
	I1219 06:20:24.190497 2064791 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1219 06:20:24.190547 2064791 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1219 06:20:24.190597 2064791 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1219 06:20:24.190648 2064791 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1219 06:20:24.190697 2064791 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1219 06:20:24.190745 2064791 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1219 06:20:24.190796 2064791 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1219 06:20:24.190844 2064791 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1219 06:20:24.261095 2064791 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1219 06:20:24.261198 2064791 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1219 06:20:24.261287 2064791 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1219 06:20:24.273343 2064791 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1219 06:20:24.278556 2064791 out.go:252]   - Generating certificates and keys ...
	I1219 06:20:24.278645 2064791 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1219 06:20:24.278707 2064791 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1219 06:20:24.278781 2064791 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1219 06:20:24.278840 2064791 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1219 06:20:24.278908 2064791 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1219 06:20:24.278961 2064791 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1219 06:20:24.279023 2064791 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1219 06:20:24.279082 2064791 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1219 06:20:24.279155 2064791 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1219 06:20:24.279227 2064791 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1219 06:20:24.279263 2064791 kubeadm.go:319] [certs] Using the existing "sa" key
	I1219 06:20:24.279319 2064791 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1219 06:20:24.586742 2064791 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1219 06:20:24.705000 2064791 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1219 06:20:25.117117 2064791 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1219 06:20:25.207046 2064791 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1219 06:20:25.407003 2064791 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1219 06:20:25.408181 2064791 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1219 06:20:25.412332 2064791 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1219 06:20:25.415422 2064791 out.go:252]   - Booting up control plane ...
	I1219 06:20:25.415519 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1219 06:20:25.415596 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1219 06:20:25.415664 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1219 06:20:25.435196 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1219 06:20:25.435555 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1219 06:20:25.442782 2064791 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1219 06:20:25.443056 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1219 06:20:25.443098 2064791 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1219 06:20:25.586740 2064791 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1219 06:20:25.586852 2064791 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1219 06:24:25.586924 2064791 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000209622s
	I1219 06:24:25.586949 2064791 kubeadm.go:319] 
	I1219 06:24:25.587005 2064791 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1219 06:24:25.587037 2064791 kubeadm.go:319] 	- The kubelet is not running
	I1219 06:24:25.587152 2064791 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1219 06:24:25.587157 2064791 kubeadm.go:319] 
	I1219 06:24:25.587305 2064791 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1219 06:24:25.587351 2064791 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1219 06:24:25.587399 2064791 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1219 06:24:25.587405 2064791 kubeadm.go:319] 
	I1219 06:24:25.592745 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 06:24:25.593206 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1219 06:24:25.593358 2064791 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 06:24:25.593654 2064791 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1219 06:24:25.593660 2064791 kubeadm.go:319] 
	I1219 06:24:25.593751 2064791 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1219 06:24:25.593818 2064791 kubeadm.go:403] duration metric: took 12m8.761907578s to StartCluster
	I1219 06:24:25.593849 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:24:25.593915 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:24:25.619076 2064791 cri.go:92] found id: ""
	I1219 06:24:25.619090 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.619097 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:24:25.619103 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:24:25.619166 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:24:25.645501 2064791 cri.go:92] found id: ""
	I1219 06:24:25.645515 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.645522 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:24:25.645527 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:24:25.645587 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:24:25.671211 2064791 cri.go:92] found id: ""
	I1219 06:24:25.671225 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.671232 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:24:25.671237 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:24:25.671297 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:24:25.695076 2064791 cri.go:92] found id: ""
	I1219 06:24:25.695090 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.695098 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:24:25.695104 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:24:25.695165 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:24:25.720717 2064791 cri.go:92] found id: ""
	I1219 06:24:25.720733 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.720740 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:24:25.720745 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:24:25.720832 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:24:25.746445 2064791 cri.go:92] found id: ""
	I1219 06:24:25.746460 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.746466 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:24:25.746478 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:24:25.746541 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:24:25.771217 2064791 cri.go:92] found id: ""
	I1219 06:24:25.771231 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.771238 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:24:25.771249 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:24:25.771259 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:24:25.827848 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:24:25.827867 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:24:25.845454 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:24:25.845470 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:24:25.916464 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:24:25.906852   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.907635   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909247   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909952   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.911845   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:24:25.906852   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.907635   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909247   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909952   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.911845   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:24:25.916485 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:24:25.916495 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:24:25.988149 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:24:25.988168 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1219 06:24:26.019538 2064791 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000209622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	W1219 06:24:26.019579 2064791 out.go:285] * 
	W1219 06:24:26.019696 2064791 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	W1219 06:24:26.019769 2064791 out.go:285] * 
	W1219 06:24:26.022296 2064791 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1219 06:24:26.028311 2064791 out.go:203] 
	W1219 06:24:26.031204 2064791 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	W1219 06:24:26.031251 2064791 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1219 06:24:26.031270 2064791 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1219 06:24:26.034280 2064791 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483798627Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483867223Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483965234Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484038564Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484104559Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484166960Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484226562Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484289119Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484361021Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484452469Z" level=info msg="Connect containerd service"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484896289Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.485577404Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.498876654Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.499249089Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.498953709Z" level=info msg="Start subscribing containerd event"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.499359457Z" level=info msg="Start recovering state"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541527820Z" level=info msg="Start event monitor"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541744389Z" level=info msg="Start cni network conf syncer for default"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541814527Z" level=info msg="Start streaming server"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541876723Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541934873Z" level=info msg="runtime interface starting up..."
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541989897Z" level=info msg="starting plugins..."
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.542066690Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 19 06:12:15 functional-006924 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.544548112Z" level=info msg="containerd successfully booted in 0.093860s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:24:27.267212   21054 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:27.267845   21054 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:27.270225   21054 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:27.271072   21054 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:27.271831   21054 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 06:24:27 up 11:06,  0 user,  load average: 0.20, 0.18, 0.44
	Linux functional-006924 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 19 06:24:23 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:24:24 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 318.
	Dec 19 06:24:24 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:24 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:24 functional-006924 kubelet[20857]: E1219 06:24:24.446436   20857 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:24:24 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:24:24 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:24:25 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 319.
	Dec 19 06:24:25 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:25 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:25 functional-006924 kubelet[20863]: E1219 06:24:25.196171   20863 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:24:25 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:24:25 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:24:25 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Dec 19 06:24:25 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:25 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:25 functional-006924 kubelet[20937]: E1219 06:24:25.962886   20937 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:24:25 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:24:25 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:24:26 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Dec 19 06:24:26 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:26 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:26 functional-006924 kubelet[20971]: E1219 06:24:26.716391   20971 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:24:26 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:24:26 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924: exit status 2 (370.830066ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-006924" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ExtraConfig (735.64s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth (2.29s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth
functional_test.go:825: (dbg) Run:  kubectl --context functional-006924 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:825: (dbg) Non-zero exit: kubectl --context functional-006924 get po -l tier=control-plane -n kube-system -o=json: exit status 1 (59.253955ms)

-- stdout --
	{
	    "apiVersion": "v1",
	    "items": [],
	    "kind": "List",
	    "metadata": {
	        "resourceVersion": ""
	    }
	}

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:827: failed to get components. args "kubectl --context functional-006924 get po -l tier=control-plane -n kube-system -o=json": exit status 1
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-006924
helpers_test.go:244: (dbg) docker inspect functional-006924:

-- stdout --
	[
	    {
	        "Id": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	        "Created": "2025-12-19T05:57:32.987616309Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2053574,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T05:57:33.050252475Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hostname",
	        "HostsPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hosts",
	        "LogPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6-json.log",
	        "Name": "/functional-006924",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-006924:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-006924",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	                "LowerDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/merged",
	                "UpperDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/diff",
	                "WorkDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-006924",
	                "Source": "/var/lib/docker/volumes/functional-006924/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-006924",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-006924",
	                "name.minikube.sigs.k8s.io": "functional-006924",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "c06ab2bd44169716d410789ed39ed6e7c04e20cbf7fddb96691439282b9c97ca",
	            "SandboxKey": "/var/run/docker/netns/c06ab2bd4416",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34704"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34705"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34708"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34706"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34707"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-006924": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "42:2f:87:6a:a8:7b",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "f63e8dc2cff83663f8a4d14108f192e61e457410fa4fc720cd9630dbf354815d",
	                    "EndpointID": "aa2b1cbd90d5c1f6130481423d97f82d974d4197e41ad0dbe3b7e51b22c8b4cc",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-006924",
	                        "651d0d6ef1db"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924: exit status 2 (305.529622ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                         ARGS                                                                          │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-125117 image ls                                                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format json --alsologtostderr                                                                                            │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ image          │ functional-125117 image ls --format table --alsologtostderr                                                                                           │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ update-context │ functional-125117 update-context --alsologtostderr -v=2                                                                                               │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ delete         │ -p functional-125117                                                                                                                                  │ functional-125117 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │ 19 Dec 25 05:57 UTC │
	│ start          │ -p functional-006924 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 05:57 UTC │                     │
	│ start          │ -p functional-006924 --alsologtostderr -v=8                                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:05 UTC │                     │
	│ cache          │ functional-006924 cache add registry.k8s.io/pause:3.1                                                                                                 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache add registry.k8s.io/pause:3.3                                                                                                 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache add registry.k8s.io/pause:latest                                                                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache add minikube-local-cache-test:functional-006924                                                                               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ functional-006924 cache delete minikube-local-cache-test:functional-006924                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ delete registry.k8s.io/pause:3.3                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ list                                                                                                                                                  │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl images                                                                                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                    │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │                     │
	│ cache          │ functional-006924 cache reload                                                                                                                        │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh            │ functional-006924 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ delete registry.k8s.io/pause:3.1                                                                                                                      │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache          │ delete registry.k8s.io/pause:latest                                                                                                                   │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ kubectl        │ functional-006924 kubectl -- --context functional-006924 get pods                                                                                     │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │                     │
	│ start          │ -p functional-006924 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │                     │
	└────────────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 06:12:12
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 06:12:12.743158 2064791 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:12:12.743269 2064791 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:12:12.743273 2064791 out.go:374] Setting ErrFile to fd 2...
	I1219 06:12:12.743277 2064791 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:12:12.743528 2064791 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:12:12.743902 2064791 out.go:368] Setting JSON to false
	I1219 06:12:12.744837 2064791 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":39279,"bootTime":1766085454,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:12:12.744896 2064791 start.go:143] virtualization:  
	I1219 06:12:12.748217 2064791 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:12:12.751238 2064791 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:12:12.751295 2064791 notify.go:221] Checking for updates...
	I1219 06:12:12.757153 2064791 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:12:12.760103 2064791 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:12:12.763068 2064791 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:12:12.765948 2064791 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:12:12.768902 2064791 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:12:12.772437 2064791 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:12:12.772538 2064791 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:12:12.804424 2064791 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:12:12.804525 2064791 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:12:12.859954 2064791 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-19 06:12:12.850685523 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:12:12.860047 2064791 docker.go:319] overlay module found
	I1219 06:12:12.863098 2064791 out.go:179] * Using the docker driver based on existing profile
	I1219 06:12:12.866014 2064791 start.go:309] selected driver: docker
	I1219 06:12:12.866030 2064791 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:12:12.866122 2064791 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:12:12.866232 2064791 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:12:12.920329 2064791 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-19 06:12:12.911575892 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:12:12.920732 2064791 start_flags.go:993] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1219 06:12:12.920793 2064791 cni.go:84] Creating CNI manager for ""
	I1219 06:12:12.920848 2064791 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:12:12.920889 2064791 start.go:353] cluster config:
	{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:12:12.924076 2064791 out.go:179] * Starting "functional-006924" primary control-plane node in "functional-006924" cluster
	I1219 06:12:12.926767 2064791 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1219 06:12:12.929823 2064791 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1219 06:12:12.932605 2064791 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:12:12.932642 2064791 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1219 06:12:12.932650 2064791 cache.go:65] Caching tarball of preloaded images
	I1219 06:12:12.932677 2064791 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1219 06:12:12.932745 2064791 preload.go:238] Found /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1219 06:12:12.932796 2064791 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1219 06:12:12.932911 2064791 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/config.json ...
	I1219 06:12:12.951789 2064791 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1219 06:12:12.951800 2064791 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1219 06:12:12.951830 2064791 cache.go:243] Successfully downloaded all kic artifacts
	I1219 06:12:12.951863 2064791 start.go:360] acquireMachinesLock for functional-006924: {Name:mkc84f48e83d18024791d45db780f3ccd746613a Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1219 06:12:12.951927 2064791 start.go:364] duration metric: took 47.033µs to acquireMachinesLock for "functional-006924"
	I1219 06:12:12.951947 2064791 start.go:96] Skipping create...Using existing machine configuration
	I1219 06:12:12.951951 2064791 fix.go:54] fixHost starting: 
	I1219 06:12:12.952210 2064791 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:12:12.969279 2064791 fix.go:112] recreateIfNeeded on functional-006924: state=Running err=<nil>
	W1219 06:12:12.969299 2064791 fix.go:138] unexpected machine state, will restart: <nil>
	I1219 06:12:12.972432 2064791 out.go:252] * Updating the running docker "functional-006924" container ...
	I1219 06:12:12.972457 2064791 machine.go:94] provisionDockerMachine start ...
	I1219 06:12:12.972536 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:12.989705 2064791 main.go:144] libmachine: Using SSH client type: native
	I1219 06:12:12.990045 2064791 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:12:12.990052 2064791 main.go:144] libmachine: About to run SSH command:
	hostname
	I1219 06:12:13.144528 2064791 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:12:13.144543 2064791 ubuntu.go:182] provisioning hostname "functional-006924"
	I1219 06:12:13.144626 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:13.163735 2064791 main.go:144] libmachine: Using SSH client type: native
	I1219 06:12:13.164043 2064791 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:12:13.164057 2064791 main.go:144] libmachine: About to run SSH command:
	sudo hostname functional-006924 && echo "functional-006924" | sudo tee /etc/hostname
	I1219 06:12:13.331538 2064791 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:12:13.331610 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:13.350490 2064791 main.go:144] libmachine: Using SSH client type: native
	I1219 06:12:13.350800 2064791 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:12:13.350813 2064791 main.go:144] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-006924' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-006924/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-006924' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1219 06:12:13.509192 2064791 main.go:144] libmachine: SSH cmd err, output: <nil>: 
	I1219 06:12:13.509210 2064791 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22230-1998525/.minikube CaCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22230-1998525/.minikube}
	I1219 06:12:13.509245 2064791 ubuntu.go:190] setting up certificates
	I1219 06:12:13.509254 2064791 provision.go:84] configureAuth start
	I1219 06:12:13.509315 2064791 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:12:13.528067 2064791 provision.go:143] copyHostCerts
	I1219 06:12:13.528151 2064791 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem, removing ...
	I1219 06:12:13.528164 2064791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:12:13.528239 2064791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem (1671 bytes)
	I1219 06:12:13.528339 2064791 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem, removing ...
	I1219 06:12:13.528348 2064791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:12:13.528375 2064791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem (1078 bytes)
	I1219 06:12:13.528452 2064791 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem, removing ...
	I1219 06:12:13.528456 2064791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:12:13.528480 2064791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem (1123 bytes)
	I1219 06:12:13.528529 2064791 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem org=jenkins.functional-006924 san=[127.0.0.1 192.168.49.2 functional-006924 localhost minikube]
	I1219 06:12:13.839797 2064791 provision.go:177] copyRemoteCerts
	I1219 06:12:13.839849 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1219 06:12:13.839888 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:13.857134 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:13.968475 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1219 06:12:13.985747 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1219 06:12:14.005527 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1219 06:12:14.024925 2064791 provision.go:87] duration metric: took 515.64823ms to configureAuth
	I1219 06:12:14.024943 2064791 ubuntu.go:206] setting minikube options for container-runtime
	I1219 06:12:14.025140 2064791 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:12:14.025146 2064791 machine.go:97] duration metric: took 1.052684031s to provisionDockerMachine
	I1219 06:12:14.025152 2064791 start.go:293] postStartSetup for "functional-006924" (driver="docker")
	I1219 06:12:14.025162 2064791 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1219 06:12:14.025218 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1219 06:12:14.025263 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.043178 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.148605 2064791 ssh_runner.go:195] Run: cat /etc/os-release
	I1219 06:12:14.151719 2064791 main.go:144] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1219 06:12:14.151753 2064791 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1219 06:12:14.151766 2064791 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/addons for local assets ...
	I1219 06:12:14.151823 2064791 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/files for local assets ...
	I1219 06:12:14.151902 2064791 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> 20003862.pem in /etc/ssl/certs
	I1219 06:12:14.151975 2064791 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> hosts in /etc/test/nested/copy/2000386
	I1219 06:12:14.152026 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/2000386
	I1219 06:12:14.159336 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:12:14.177055 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts --> /etc/test/nested/copy/2000386/hosts (40 bytes)
	I1219 06:12:14.195053 2064791 start.go:296] duration metric: took 169.886807ms for postStartSetup
	I1219 06:12:14.195138 2064791 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:12:14.195175 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.212871 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.317767 2064791 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1219 06:12:14.322386 2064791 fix.go:56] duration metric: took 1.37042768s for fixHost
	I1219 06:12:14.322401 2064791 start.go:83] releasing machines lock for "functional-006924", held for 1.370467196s
	I1219 06:12:14.322474 2064791 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:12:14.339208 2064791 ssh_runner.go:195] Run: cat /version.json
	I1219 06:12:14.339250 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.339514 2064791 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1219 06:12:14.339574 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.363989 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.366009 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.468260 2064791 ssh_runner.go:195] Run: systemctl --version
	I1219 06:12:14.559810 2064791 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1219 06:12:14.563901 2064791 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1219 06:12:14.563968 2064791 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1219 06:12:14.571453 2064791 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1219 06:12:14.571466 2064791 start.go:496] detecting cgroup driver to use...
	I1219 06:12:14.571496 2064791 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1219 06:12:14.571541 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1219 06:12:14.588970 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1219 06:12:14.603919 2064791 docker.go:218] disabling cri-docker service (if available) ...
	I1219 06:12:14.603971 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1219 06:12:14.620412 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1219 06:12:14.634912 2064791 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1219 06:12:14.757018 2064791 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1219 06:12:14.879281 2064791 docker.go:234] disabling docker service ...
	I1219 06:12:14.879341 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1219 06:12:14.894279 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1219 06:12:14.907362 2064791 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1219 06:12:15.033676 2064791 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1219 06:12:15.155919 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1219 06:12:15.169590 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1219 06:12:15.184917 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1219 06:12:15.194691 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1219 06:12:15.203742 2064791 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1219 06:12:15.203801 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1219 06:12:15.212945 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:12:15.221903 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1219 06:12:15.231019 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:12:15.239988 2064791 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1219 06:12:15.248292 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1219 06:12:15.257554 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1219 06:12:15.266460 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1219 06:12:15.275351 2064791 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1219 06:12:15.282864 2064791 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1219 06:12:15.290662 2064791 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:12:15.400462 2064791 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1219 06:12:15.544853 2064791 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1219 06:12:15.544914 2064791 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1219 06:12:15.549076 2064791 start.go:564] Will wait 60s for crictl version
	I1219 06:12:15.549132 2064791 ssh_runner.go:195] Run: which crictl
	I1219 06:12:15.552855 2064791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1219 06:12:15.578380 2064791 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1219 06:12:15.578461 2064791 ssh_runner.go:195] Run: containerd --version
	I1219 06:12:15.600920 2064791 ssh_runner.go:195] Run: containerd --version
	I1219 06:12:15.626436 2064791 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1219 06:12:15.629308 2064791 cli_runner.go:164] Run: docker network inspect functional-006924 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 06:12:15.645624 2064791 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1219 06:12:15.652379 2064791 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1219 06:12:15.655147 2064791 kubeadm.go:884] updating cluster {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1219 06:12:15.655272 2064791 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:12:15.655368 2064791 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:12:15.679674 2064791 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:12:15.679686 2064791 containerd.go:534] Images already preloaded, skipping extraction
	I1219 06:12:15.679751 2064791 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:12:15.704545 2064791 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:12:15.704557 2064791 cache_images.go:86] Images are preloaded, skipping loading
	I1219 06:12:15.704563 2064791 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1219 06:12:15.704666 2064791 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-006924 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1219 06:12:15.704733 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s info
	I1219 06:12:15.729671 2064791 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1219 06:12:15.729690 2064791 cni.go:84] Creating CNI manager for ""
	I1219 06:12:15.729697 2064791 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:12:15.729711 2064791 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1219 06:12:15.729738 2064791 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-006924 NodeName:functional-006924 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1219 06:12:15.729853 2064791 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-006924"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
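	The generated kubeadm.yaml above is a single file carrying four YAML documents separated by `---`: InitConfiguration, ClusterConfiguration, KubeletConfiguration, and KubeProxyConfiguration. A quick structural sanity check on a skeleton of that layout (file path is illustrative):

```shell
# Skeleton of the four-document kubeadm.yaml; counting 'kind:' lines is
# a cheap check that all four configuration documents are present.
cat > /tmp/kubeadm-skel.yaml <<'EOF'
apiVersion: kubeadm.k8s.io/v1beta4
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
EOF
grep -c '^kind:' /tmp/kubeadm-skel.yaml   # prints 4
```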
	
	I1219 06:12:15.729919 2064791 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1219 06:12:15.737786 2064791 binaries.go:51] Found k8s binaries, skipping transfer
	I1219 06:12:15.737845 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1219 06:12:15.745456 2064791 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1219 06:12:15.758378 2064791 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1219 06:12:15.775454 2064791 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2085 bytes)
	I1219 06:12:15.788878 2064791 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1219 06:12:15.792954 2064791 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:12:15.901526 2064791 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:12:16.150661 2064791 certs.go:69] Setting up /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924 for IP: 192.168.49.2
	I1219 06:12:16.150673 2064791 certs.go:195] generating shared ca certs ...
	I1219 06:12:16.150687 2064791 certs.go:227] acquiring lock for ca certs: {Name:mk382c71693ea4061363f97b153b21bf6cdf5f38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:12:16.150828 2064791 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key
	I1219 06:12:16.150868 2064791 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key
	I1219 06:12:16.150873 2064791 certs.go:257] generating profile certs ...
	I1219 06:12:16.150961 2064791 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key
	I1219 06:12:16.151009 2064791 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key.febe6fed
	I1219 06:12:16.151048 2064791 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key
	I1219 06:12:16.151165 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem (1338 bytes)
	W1219 06:12:16.151195 2064791 certs.go:480] ignoring /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386_empty.pem, impossibly tiny 0 bytes
	I1219 06:12:16.151202 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem (1679 bytes)
	I1219 06:12:16.151230 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem (1078 bytes)
	I1219 06:12:16.151264 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem (1123 bytes)
	I1219 06:12:16.151286 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem (1671 bytes)
	I1219 06:12:16.151329 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:12:16.151962 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1219 06:12:16.174202 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1219 06:12:16.194590 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1219 06:12:16.215085 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1219 06:12:16.232627 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1219 06:12:16.250371 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1219 06:12:16.267689 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1219 06:12:16.285522 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1219 06:12:16.302837 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /usr/share/ca-certificates/20003862.pem (1708 bytes)
	I1219 06:12:16.320411 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1219 06:12:16.337922 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem --> /usr/share/ca-certificates/2000386.pem (1338 bytes)
	I1219 06:12:16.355077 2064791 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (722 bytes)
	I1219 06:12:16.368122 2064791 ssh_runner.go:195] Run: openssl version
	I1219 06:12:16.374305 2064791 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.381720 2064791 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1219 06:12:16.389786 2064791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.393456 2064791 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.393514 2064791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.434859 2064791 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1219 06:12:16.442942 2064791 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.450665 2064791 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/2000386.pem /etc/ssl/certs/2000386.pem
	I1219 06:12:16.458612 2064791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.462545 2064791 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.462603 2064791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.503732 2064791 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1219 06:12:16.511394 2064791 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.519328 2064791 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/20003862.pem /etc/ssl/certs/20003862.pem
	I1219 06:12:16.526844 2064791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.530487 2064791 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.530547 2064791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.571532 2064791 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
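	The `openssl x509 -hash` calls and `ln -fs ... /etc/ssl/certs/<hash>.0` pairs above follow OpenSSL's c_rehash convention: the symlink is named after the 8-hex-digit subject-name hash of the certificate, which is how OpenSSL locates trust anchors in a CA directory. A sketch on a throwaway CA (paths and subject are illustrative):

```shell
# Create a throwaway CA cert, compute its subject hash, and install it
# under the <hash>.0 name OpenSSL expects in a CA directory.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=demoCA" \
  -keyout /tmp/demoCA.key -out /tmp/demoCA.pem 2>/dev/null
hash=$(openssl x509 -hash -noout -in /tmp/demoCA.pem)
mkdir -p /tmp/certs-demo
ln -fs /tmp/demoCA.pem "/tmp/certs-demo/${hash}.0"
echo "$hash" | grep -Eq '^[0-9a-f]{8}$' && echo "hash-named symlink installed"
```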
	I1219 06:12:16.579524 2064791 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:12:16.583470 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1219 06:12:16.624483 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1219 06:12:16.665575 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1219 06:12:16.707109 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1219 06:12:16.749520 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1219 06:12:16.790988 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1219 06:12:16.831921 2064791 kubeadm.go:401] StartCluster: {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:12:16.832006 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1219 06:12:16.832084 2064791 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:12:16.857771 2064791 cri.go:92] found id: ""
	I1219 06:12:16.857833 2064791 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1219 06:12:16.866091 2064791 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1219 06:12:16.866101 2064791 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1219 06:12:16.866158 2064791 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1219 06:12:16.873926 2064791 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.874482 2064791 kubeconfig.go:125] found "functional-006924" server: "https://192.168.49.2:8441"
	I1219 06:12:16.875731 2064791 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1219 06:12:16.883987 2064791 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-19 05:57:41.594715365 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-19 06:12:15.784216685 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
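	The drift detection above rests on `diff -u`'s exit status: 0 when the deployed kubeadm.yaml matches the freshly generated one, non-zero when they differ (as here, where only the admission-plugin value changed). A minimal illustration with throwaway files:

```shell
# diff -u exits non-zero when files differ; minikube treats that as
# "config drift" and reconfigures the cluster from the new file.
printf 'enable-admission-plugins: old\n' > /tmp/kubeadm-demo.yaml
printf 'enable-admission-plugins: new\n' > /tmp/kubeadm-demo.yaml.new
diff -u /tmp/kubeadm-demo.yaml /tmp/kubeadm-demo.yaml.new \
  || echo "drift detected"
```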
	I1219 06:12:16.884007 2064791 kubeadm.go:1161] stopping kube-system containers ...
	I1219 06:12:16.884018 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1219 06:12:16.884079 2064791 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:12:16.914439 2064791 cri.go:92] found id: ""
	I1219 06:12:16.914509 2064791 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1219 06:12:16.934128 2064791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:12:16.942432 2064791 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5631 Dec 19 06:01 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5640 Dec 19 06:01 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec 19 06:01 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec 19 06:01 /etc/kubernetes/scheduler.conf
	
	I1219 06:12:16.942490 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 06:12:16.950312 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 06:12:16.957901 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.957957 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:12:16.965831 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 06:12:16.973975 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.974043 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:12:16.981885 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 06:12:16.989698 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.989754 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:12:16.997294 2064791 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1219 06:12:17.007519 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:17.060607 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:18.829242 2064791 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.768608779s)
	I1219 06:12:18.829304 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:19.030093 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:19.096673 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:19.143573 2064791 api_server.go:52] waiting for apiserver process to appear ...
	I1219 06:12:19.143640 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	[... same "sudo pgrep -xnf kube-apiserver.*minikube.*" probe repeated every ~500ms, 118 attempts from 06:12:19.643853 through 06:13:17.643808, none finding an apiserver process ...]
	I1219 06:13:18.643722 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:19.143899 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:19.143976 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:19.173119 2064791 cri.go:92] found id: ""
	I1219 06:13:19.173133 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.173141 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:19.173146 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:19.173204 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:19.207794 2064791 cri.go:92] found id: ""
	I1219 06:13:19.207807 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.207814 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:19.207819 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:19.207884 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:19.237060 2064791 cri.go:92] found id: ""
	I1219 06:13:19.237074 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.237081 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:19.237092 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:19.237154 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:19.262099 2064791 cri.go:92] found id: ""
	I1219 06:13:19.262114 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.262121 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:19.262126 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:19.262185 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:19.287540 2064791 cri.go:92] found id: ""
	I1219 06:13:19.287554 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.287561 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:19.287566 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:19.287632 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:19.315088 2064791 cri.go:92] found id: ""
	I1219 06:13:19.315102 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.315109 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:19.315115 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:19.315176 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:19.340777 2064791 cri.go:92] found id: ""
	I1219 06:13:19.340791 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.340798 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:19.340806 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:19.340818 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:19.357916 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:19.357932 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:19.426302 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:19.417866   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.418382   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420072   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420408   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.421854   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:19.417866   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.418382   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420072   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420408   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.421854   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:19.426313 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:19.426323 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:19.488347 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:19.488367 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:19.520211 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:19.520229 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:22.084930 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:22.095535 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:22.095602 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:22.122011 2064791 cri.go:92] found id: ""
	I1219 06:13:22.122025 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.122034 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:22.122059 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:22.122131 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:22.146880 2064791 cri.go:92] found id: ""
	I1219 06:13:22.146893 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.146900 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:22.146905 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:22.146975 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:22.176007 2064791 cri.go:92] found id: ""
	I1219 06:13:22.176021 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.176028 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:22.176033 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:22.176095 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:22.211343 2064791 cri.go:92] found id: ""
	I1219 06:13:22.211357 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.211365 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:22.211370 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:22.211429 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:22.235806 2064791 cri.go:92] found id: ""
	I1219 06:13:22.235829 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.235836 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:22.235841 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:22.235910 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:22.260858 2064791 cri.go:92] found id: ""
	I1219 06:13:22.260882 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.260888 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:22.260894 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:22.260954 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:22.285583 2064791 cri.go:92] found id: ""
	I1219 06:13:22.285597 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.285604 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:22.285613 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:22.285624 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:22.302970 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:22.302988 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:22.371208 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:22.362378   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.363321   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365234   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365718   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.367191   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:22.362378   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.363321   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365234   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365718   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.367191   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:22.371227 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:22.371238 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:22.433354 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:22.433373 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:22.468288 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:22.468305 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:25.028097 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:25.038266 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:25.038327 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:25.066109 2064791 cri.go:92] found id: ""
	I1219 06:13:25.066123 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.066130 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:25.066136 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:25.066199 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:25.091083 2064791 cri.go:92] found id: ""
	I1219 06:13:25.091096 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.091103 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:25.091109 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:25.091175 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:25.116729 2064791 cri.go:92] found id: ""
	I1219 06:13:25.116743 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.116750 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:25.116808 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:25.116890 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:25.145471 2064791 cri.go:92] found id: ""
	I1219 06:13:25.145485 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.145492 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:25.145497 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:25.145555 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:25.173780 2064791 cri.go:92] found id: ""
	I1219 06:13:25.173795 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.173801 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:25.173807 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:25.173876 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:25.202994 2064791 cri.go:92] found id: ""
	I1219 06:13:25.203008 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.203015 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:25.203021 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:25.203082 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:25.228548 2064791 cri.go:92] found id: ""
	I1219 06:13:25.228563 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.228570 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:25.228578 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:25.228590 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:25.260074 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:25.260090 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:25.316293 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:25.316311 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:25.333755 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:25.333771 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:25.395261 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:25.387061   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.387481   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389091   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389803   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.391447   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:25.387061   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.387481   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389091   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389803   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.391447   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:25.395273 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:25.395290 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:27.958003 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:27.968507 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:27.968571 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:27.994859 2064791 cri.go:92] found id: ""
	I1219 06:13:27.994872 2064791 logs.go:282] 0 containers: []
	W1219 06:13:27.994879 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:27.994884 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:27.994942 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:28.023716 2064791 cri.go:92] found id: ""
	I1219 06:13:28.023729 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.023736 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:28.023741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:28.023807 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:28.048490 2064791 cri.go:92] found id: ""
	I1219 06:13:28.048504 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.048512 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:28.048517 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:28.048575 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:28.074305 2064791 cri.go:92] found id: ""
	I1219 06:13:28.074319 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.074326 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:28.074332 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:28.074392 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:28.098924 2064791 cri.go:92] found id: ""
	I1219 06:13:28.098938 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.098945 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:28.098950 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:28.099021 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:28.123000 2064791 cri.go:92] found id: ""
	I1219 06:13:28.123013 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.123021 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:28.123026 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:28.123091 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:28.150415 2064791 cri.go:92] found id: ""
	I1219 06:13:28.150428 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.150435 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:28.150443 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:28.150453 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:28.210763 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:28.210782 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:28.230191 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:28.230208 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:28.294389 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:28.286078   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.286664   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288135   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288528   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.290322   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:28.286078   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.286664   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288135   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288528   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.290322   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:28.294400 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:28.294411 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:28.357351 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:28.357371 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:30.888172 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:30.898614 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:30.898676 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:30.926377 2064791 cri.go:92] found id: ""
	I1219 06:13:30.926391 2064791 logs.go:282] 0 containers: []
	W1219 06:13:30.926398 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:30.926403 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:30.926458 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:30.950084 2064791 cri.go:92] found id: ""
	I1219 06:13:30.950097 2064791 logs.go:282] 0 containers: []
	W1219 06:13:30.950111 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:30.950117 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:30.950180 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:30.975713 2064791 cri.go:92] found id: ""
	I1219 06:13:30.975726 2064791 logs.go:282] 0 containers: []
	W1219 06:13:30.975734 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:30.975740 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:30.975798 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:31.012698 2064791 cri.go:92] found id: ""
	I1219 06:13:31.012712 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.012719 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:31.012725 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:31.012833 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:31.036945 2064791 cri.go:92] found id: ""
	I1219 06:13:31.036958 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.036965 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:31.036970 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:31.037028 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:31.062431 2064791 cri.go:92] found id: ""
	I1219 06:13:31.062445 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.062452 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:31.062457 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:31.062538 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:31.088075 2064791 cri.go:92] found id: ""
	I1219 06:13:31.088099 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.088106 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:31.088114 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:31.088123 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:31.143908 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:31.143928 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:31.164642 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:31.164661 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:31.241367 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:31.232734   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.233582   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235302   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235873   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.237342   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:31.232734   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.233582   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235302   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235873   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.237342   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:31.241378 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:31.241388 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:31.304583 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:31.304602 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:33.835874 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:33.847289 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:33.847350 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:33.874497 2064791 cri.go:92] found id: ""
	I1219 06:13:33.874511 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.874518 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:33.874523 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:33.874602 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:33.899113 2064791 cri.go:92] found id: ""
	I1219 06:13:33.899127 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.899134 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:33.899139 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:33.899198 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:33.927533 2064791 cri.go:92] found id: ""
	I1219 06:13:33.927546 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.927553 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:33.927559 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:33.927616 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:33.955150 2064791 cri.go:92] found id: ""
	I1219 06:13:33.955163 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.955170 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:33.955176 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:33.955233 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:33.979739 2064791 cri.go:92] found id: ""
	I1219 06:13:33.979753 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.979760 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:33.979765 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:33.979824 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:34.005264 2064791 cri.go:92] found id: ""
	I1219 06:13:34.005283 2064791 logs.go:282] 0 containers: []
	W1219 06:13:34.005291 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:34.005298 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:34.005375 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:34.031917 2064791 cri.go:92] found id: ""
	I1219 06:13:34.031931 2064791 logs.go:282] 0 containers: []
	W1219 06:13:34.031949 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:34.031958 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:34.031968 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:34.098907 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:34.098938 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:34.117494 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:34.117513 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:34.190606 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:34.181776   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.182594   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184322   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184963   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.186654   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:34.181776   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.182594   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184322   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184963   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.186654   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:34.190617 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:34.190630 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:34.260586 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:34.260607 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:36.792986 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:36.803226 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:36.803292 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:36.830943 2064791 cri.go:92] found id: ""
	I1219 06:13:36.830957 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.830964 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:36.830970 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:36.831029 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:36.856036 2064791 cri.go:92] found id: ""
	I1219 06:13:36.856051 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.856058 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:36.856063 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:36.856133 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:36.880807 2064791 cri.go:92] found id: ""
	I1219 06:13:36.880821 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.880828 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:36.880834 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:36.880893 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:36.904515 2064791 cri.go:92] found id: ""
	I1219 06:13:36.904529 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.904536 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:36.904542 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:36.904601 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:36.929517 2064791 cri.go:92] found id: ""
	I1219 06:13:36.929530 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.929538 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:36.929543 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:36.929615 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:36.953623 2064791 cri.go:92] found id: ""
	I1219 06:13:36.953636 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.953644 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:36.953650 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:36.953706 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:36.978769 2064791 cri.go:92] found id: ""
	I1219 06:13:36.978783 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.978790 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:36.978797 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:36.978807 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:37.036051 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:37.036072 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:37.053881 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:37.053898 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:37.117512 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:37.109460   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.110050   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.111554   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.112021   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.113512   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:37.109460   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.110050   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.111554   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.112021   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.113512   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:37.117523 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:37.117532 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:37.185580 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:37.185599 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:39.724185 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:39.735602 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:39.735670 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:39.760200 2064791 cri.go:92] found id: ""
	I1219 06:13:39.760214 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.760222 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:39.760227 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:39.760286 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:39.787416 2064791 cri.go:92] found id: ""
	I1219 06:13:39.787429 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.787437 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:39.787442 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:39.787505 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:39.811808 2064791 cri.go:92] found id: ""
	I1219 06:13:39.811822 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.811830 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:39.811836 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:39.811902 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:39.837072 2064791 cri.go:92] found id: ""
	I1219 06:13:39.837086 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.837093 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:39.837099 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:39.837200 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:39.866418 2064791 cri.go:92] found id: ""
	I1219 06:13:39.866432 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.866438 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:39.866444 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:39.866502 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:39.894744 2064791 cri.go:92] found id: ""
	I1219 06:13:39.894758 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.894765 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:39.894770 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:39.894833 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:39.921608 2064791 cri.go:92] found id: ""
	I1219 06:13:39.921622 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.921629 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:39.921643 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:39.921654 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:39.985200 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:39.985220 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:40.004064 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:40.004091 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:40.077619 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:40.069025   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.069761   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.071391   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.072228   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.073414   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:40.069025   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.069761   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.071391   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.072228   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.073414   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:40.077631 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:40.077641 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:40.142102 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:40.142127 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:42.682372 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:42.692608 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:42.692675 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:42.716749 2064791 cri.go:92] found id: ""
	I1219 06:13:42.716796 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.716804 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:42.716809 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:42.716888 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:42.740973 2064791 cri.go:92] found id: ""
	I1219 06:13:42.740986 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.740993 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:42.740999 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:42.741064 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:42.765521 2064791 cri.go:92] found id: ""
	I1219 06:13:42.765535 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.765543 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:42.765548 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:42.765607 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:42.790000 2064791 cri.go:92] found id: ""
	I1219 06:13:42.790015 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.790034 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:42.790040 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:42.790107 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:42.813722 2064791 cri.go:92] found id: ""
	I1219 06:13:42.813736 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.813743 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:42.813752 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:42.813814 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:42.838912 2064791 cri.go:92] found id: ""
	I1219 06:13:42.838926 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.838934 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:42.838939 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:42.839002 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:42.867044 2064791 cri.go:92] found id: ""
	I1219 06:13:42.867058 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.867065 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:42.867073 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:42.867083 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:42.923612 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:42.923632 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:42.941274 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:42.941293 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:43.008705 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:42.998396   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:42.999004   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000468   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000919   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.003777   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:42.998396   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:42.999004   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000468   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000919   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.003777   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:43.008716 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:43.008736 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:43.074629 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:43.074654 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:45.608725 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:45.619043 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:45.619107 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:45.645025 2064791 cri.go:92] found id: ""
	I1219 06:13:45.645041 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.645049 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:45.645054 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:45.645120 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:45.671700 2064791 cri.go:92] found id: ""
	I1219 06:13:45.671716 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.671723 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:45.671735 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:45.671797 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:45.701839 2064791 cri.go:92] found id: ""
	I1219 06:13:45.701864 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.701872 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:45.701878 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:45.701947 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:45.731819 2064791 cri.go:92] found id: ""
	I1219 06:13:45.731834 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.731841 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:45.731847 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:45.731910 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:45.758372 2064791 cri.go:92] found id: ""
	I1219 06:13:45.758386 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.758393 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:45.758399 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:45.758464 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:45.784713 2064791 cri.go:92] found id: ""
	I1219 06:13:45.784727 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.784734 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:45.784739 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:45.784829 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:45.811948 2064791 cri.go:92] found id: ""
	I1219 06:13:45.811962 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.811969 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:45.811977 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:45.811987 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:45.868299 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:45.868317 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:45.886032 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:45.886049 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:45.952733 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:45.944285   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.944957   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946453   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946859   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.948306   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:45.944285   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.944957   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946453   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946859   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.948306   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:45.952743 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:45.952783 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:46.020565 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:46.020588 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:48.550865 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:48.561408 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:48.561483 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:48.586776 2064791 cri.go:92] found id: ""
	I1219 06:13:48.586790 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.586797 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:48.586802 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:48.586864 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:48.612701 2064791 cri.go:92] found id: ""
	I1219 06:13:48.612715 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.612722 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:48.612727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:48.612808 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:48.637097 2064791 cri.go:92] found id: ""
	I1219 06:13:48.637110 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.637118 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:48.637124 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:48.637183 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:48.662701 2064791 cri.go:92] found id: ""
	I1219 06:13:48.662715 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.662722 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:48.662727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:48.662785 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:48.690291 2064791 cri.go:92] found id: ""
	I1219 06:13:48.690304 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.690311 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:48.690316 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:48.690376 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:48.715968 2064791 cri.go:92] found id: ""
	I1219 06:13:48.715983 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.715990 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:48.715995 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:48.716059 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:48.741069 2064791 cri.go:92] found id: ""
	I1219 06:13:48.741082 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.741090 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:48.741097 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:48.741113 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:48.796842 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:48.796863 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:48.814146 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:48.814166 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:48.879995 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:48.871133   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.871771   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.873597   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.874180   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.875997   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:48.871133   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.871771   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.873597   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.874180   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.875997   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:48.880005 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:48.880017 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:48.943211 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:48.943231 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:51.472961 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:51.483727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:51.483805 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:51.513333 2064791 cri.go:92] found id: ""
	I1219 06:13:51.513347 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.513354 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:51.513360 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:51.513426 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:51.539359 2064791 cri.go:92] found id: ""
	I1219 06:13:51.539373 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.539380 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:51.539392 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:51.539449 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:51.564730 2064791 cri.go:92] found id: ""
	I1219 06:13:51.564743 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.564750 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:51.564794 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:51.564855 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:51.590117 2064791 cri.go:92] found id: ""
	I1219 06:13:51.590138 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.590145 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:51.590150 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:51.590210 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:51.614688 2064791 cri.go:92] found id: ""
	I1219 06:13:51.614702 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.614709 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:51.614715 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:51.614778 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:51.638492 2064791 cri.go:92] found id: ""
	I1219 06:13:51.638508 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.638518 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:51.638524 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:51.638597 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:51.666861 2064791 cri.go:92] found id: ""
	I1219 06:13:51.666874 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.666881 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:51.666888 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:51.666899 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:51.731208 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:51.723294   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.723881   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725543   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725958   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.727478   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:51.723294   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.723881   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725543   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725958   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.727478   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:51.731218 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:51.731228 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:51.793354 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:51.793375 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:51.819761 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:51.819784 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:51.877976 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:51.877996 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:54.395396 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:54.405788 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:54.405848 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:54.444121 2064791 cri.go:92] found id: ""
	I1219 06:13:54.444151 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.444159 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:54.444164 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:54.444243 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:54.471038 2064791 cri.go:92] found id: ""
	I1219 06:13:54.471064 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.471072 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:54.471077 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:54.471160 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:54.500364 2064791 cri.go:92] found id: ""
	I1219 06:13:54.500377 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.500385 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:54.500390 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:54.500450 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:54.525919 2064791 cri.go:92] found id: ""
	I1219 06:13:54.525934 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.525941 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:54.525962 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:54.526021 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:54.551211 2064791 cri.go:92] found id: ""
	I1219 06:13:54.551225 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.551232 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:54.551239 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:54.551310 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:54.577841 2064791 cri.go:92] found id: ""
	I1219 06:13:54.577854 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.577861 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:54.577866 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:54.577931 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:54.602636 2064791 cri.go:92] found id: ""
	I1219 06:13:54.602650 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.602656 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:54.602664 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:54.602675 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:54.619644 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:54.619661 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:54.682901 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:54.674199   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.675104   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.676718   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.677329   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.678988   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:54.674199   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.675104   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.676718   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.677329   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.678988   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:54.682911 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:54.682921 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:54.749370 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:54.749393 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:54.780731 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:54.780747 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:57.338712 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:57.349237 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:57.349299 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:57.378161 2064791 cri.go:92] found id: ""
	I1219 06:13:57.378175 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.378181 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:57.378187 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:57.378247 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:57.403073 2064791 cri.go:92] found id: ""
	I1219 06:13:57.403087 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.403094 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:57.403099 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:57.403160 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:57.431222 2064791 cri.go:92] found id: ""
	I1219 06:13:57.431236 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.431244 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:57.431249 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:57.431306 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:57.466943 2064791 cri.go:92] found id: ""
	I1219 06:13:57.466957 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.466964 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:57.466969 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:57.467027 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:57.493181 2064791 cri.go:92] found id: ""
	I1219 06:13:57.493194 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.493201 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:57.493206 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:57.493265 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:57.517521 2064791 cri.go:92] found id: ""
	I1219 06:13:57.517534 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.517543 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:57.517549 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:57.517606 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:57.546827 2064791 cri.go:92] found id: ""
	I1219 06:13:57.546841 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.546848 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:57.546856 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:57.546865 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:57.603521 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:57.603540 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:57.620971 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:57.620988 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:57.687316 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:57.678838   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.679404   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.680987   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.681452   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.683006   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:57.678838   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.679404   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.680987   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.681452   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.683006   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:57.687326 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:57.687336 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:57.759758 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:57.759787 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:00.293478 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:00.313120 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:00.313205 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:00.349921 2064791 cri.go:92] found id: ""
	I1219 06:14:00.349938 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.349947 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:00.349953 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:00.350031 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:00.381005 2064791 cri.go:92] found id: ""
	I1219 06:14:00.381022 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.381031 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:00.381037 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:00.381113 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:00.415179 2064791 cri.go:92] found id: ""
	I1219 06:14:00.415194 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.415202 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:00.415207 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:00.415268 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:00.455068 2064791 cri.go:92] found id: ""
	I1219 06:14:00.455084 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.455090 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:00.455096 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:00.455170 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:00.488360 2064791 cri.go:92] found id: ""
	I1219 06:14:00.488374 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.488382 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:00.488387 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:00.488450 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:00.514399 2064791 cri.go:92] found id: ""
	I1219 06:14:00.514414 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.514420 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:00.514426 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:00.514485 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:00.544639 2064791 cri.go:92] found id: ""
	I1219 06:14:00.544655 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.544662 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:00.544670 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:00.544683 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:00.562442 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:00.562459 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:00.630032 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:00.620824   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.621603   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.623426   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.624012   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.625740   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:00.620824   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.621603   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.623426   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.624012   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.625740   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:00.630043 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:00.630053 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:00.693056 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:00.693075 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:00.724344 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:00.724362 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:03.282407 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:03.292404 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:03.292463 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:03.322285 2064791 cri.go:92] found id: ""
	I1219 06:14:03.322298 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.322305 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:03.322310 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:03.322377 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:03.345824 2064791 cri.go:92] found id: ""
	I1219 06:14:03.345838 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.345846 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:03.345852 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:03.345913 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:03.369194 2064791 cri.go:92] found id: ""
	I1219 06:14:03.369208 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.369214 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:03.369220 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:03.369280 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:03.393453 2064791 cri.go:92] found id: ""
	I1219 06:14:03.393467 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.393474 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:03.393479 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:03.393538 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:03.423067 2064791 cri.go:92] found id: ""
	I1219 06:14:03.423082 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.423088 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:03.423093 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:03.423149 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:03.449404 2064791 cri.go:92] found id: ""
	I1219 06:14:03.449418 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.449424 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:03.449430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:03.449491 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:03.483320 2064791 cri.go:92] found id: ""
	I1219 06:14:03.483334 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.483342 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:03.483349 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:03.483360 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:03.546816 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:03.538361   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.539133   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.540745   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.541423   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.542995   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:03.538361   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.539133   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.540745   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.541423   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.542995   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:03.546828 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:03.546840 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:03.608924 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:03.608943 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:03.640931 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:03.640947 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:03.698583 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:03.698601 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:06.217289 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:06.228468 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:06.228538 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:06.254249 2064791 cri.go:92] found id: ""
	I1219 06:14:06.254264 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.254271 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:06.254276 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:06.254335 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:06.278663 2064791 cri.go:92] found id: ""
	I1219 06:14:06.278677 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.278685 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:06.278691 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:06.278751 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:06.304128 2064791 cri.go:92] found id: ""
	I1219 06:14:06.304143 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.304150 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:06.304162 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:06.304224 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:06.330238 2064791 cri.go:92] found id: ""
	I1219 06:14:06.330252 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.330259 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:06.330265 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:06.330326 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:06.354219 2064791 cri.go:92] found id: ""
	I1219 06:14:06.354234 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.354241 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:06.354246 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:06.354307 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:06.382747 2064791 cri.go:92] found id: ""
	I1219 06:14:06.382762 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.382769 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:06.382777 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:06.382837 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:06.421656 2064791 cri.go:92] found id: ""
	I1219 06:14:06.421670 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.421677 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:06.421685 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:06.421694 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:06.498836 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:06.498857 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:06.531636 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:06.531653 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:06.590085 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:06.590106 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:06.608226 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:06.608243 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:06.675159 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:06.667629   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.668042   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669562   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669908   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.671371   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:06.667629   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.668042   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669562   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669908   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.671371   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:09.176005 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:09.186839 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:09.186916 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:09.211786 2064791 cri.go:92] found id: ""
	I1219 06:14:09.211800 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.211807 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:09.211812 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:09.211873 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:09.240415 2064791 cri.go:92] found id: ""
	I1219 06:14:09.240429 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.240436 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:09.240441 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:09.240503 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:09.266183 2064791 cri.go:92] found id: ""
	I1219 06:14:09.266197 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.266204 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:09.266209 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:09.266269 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:09.294483 2064791 cri.go:92] found id: ""
	I1219 06:14:09.294497 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.294504 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:09.294509 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:09.294572 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:09.319997 2064791 cri.go:92] found id: ""
	I1219 06:14:09.320011 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.320019 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:09.320024 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:09.320113 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:09.346661 2064791 cri.go:92] found id: ""
	I1219 06:14:09.346675 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.346683 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:09.346688 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:09.346746 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:09.371664 2064791 cri.go:92] found id: ""
	I1219 06:14:09.371690 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.371698 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:09.371706 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:09.371717 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:09.389515 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:09.389534 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:09.473775 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:09.465363   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.465921   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.467541   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.468108   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.469733   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:09.465363   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.465921   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.467541   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.468108   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.469733   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:09.473785 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:09.473796 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:09.541712 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:09.541736 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:09.577440 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:09.577456 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:12.133722 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:12.144214 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:12.144277 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:12.170929 2064791 cri.go:92] found id: ""
	I1219 06:14:12.170944 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.170951 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:12.170956 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:12.171026 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:12.195988 2064791 cri.go:92] found id: ""
	I1219 06:14:12.196002 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.196008 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:12.196014 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:12.196073 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:12.221254 2064791 cri.go:92] found id: ""
	I1219 06:14:12.221269 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.221276 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:12.221281 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:12.221346 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:12.246403 2064791 cri.go:92] found id: ""
	I1219 06:14:12.246417 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.246424 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:12.246430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:12.246491 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:12.271124 2064791 cri.go:92] found id: ""
	I1219 06:14:12.271139 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.271145 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:12.271150 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:12.271209 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:12.296180 2064791 cri.go:92] found id: ""
	I1219 06:14:12.296194 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.296211 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:12.296216 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:12.296284 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:12.322520 2064791 cri.go:92] found id: ""
	I1219 06:14:12.322534 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.322541 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:12.322548 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:12.322559 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:12.349890 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:12.349907 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:12.407189 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:12.407210 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:12.426453 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:12.426469 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:12.499487 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:12.491549   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.491978   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493525   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493875   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.495402   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:12.491549   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.491978   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493525   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493875   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.495402   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:12.499498 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:12.499509 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:15.067160 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:15.078543 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:15.078611 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:15.104838 2064791 cri.go:92] found id: ""
	I1219 06:14:15.104852 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.104860 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:15.104865 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:15.104933 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:15.130179 2064791 cri.go:92] found id: ""
	I1219 06:14:15.130194 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.130201 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:15.130207 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:15.130268 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:15.156134 2064791 cri.go:92] found id: ""
	I1219 06:14:15.156147 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.156154 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:15.156159 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:15.156221 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:15.182543 2064791 cri.go:92] found id: ""
	I1219 06:14:15.182557 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.182564 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:15.182570 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:15.182631 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:15.212350 2064791 cri.go:92] found id: ""
	I1219 06:14:15.212364 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.212371 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:15.212376 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:15.212437 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:15.239403 2064791 cri.go:92] found id: ""
	I1219 06:14:15.239418 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.239425 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:15.239430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:15.239490 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:15.265288 2064791 cri.go:92] found id: ""
	I1219 06:14:15.265303 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.265310 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:15.265318 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:15.265328 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:15.322825 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:15.322845 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:15.339946 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:15.339963 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:15.406282 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:15.394886   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.395459   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397067   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397414   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.399914   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:15.394886   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.395459   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397067   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397414   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.399914   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:15.406294 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:15.406305 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:15.481322 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:15.481342 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:18.011054 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:18.022305 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:18.022367 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:18.048236 2064791 cri.go:92] found id: ""
	I1219 06:14:18.048250 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.048257 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:18.048262 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:18.048326 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:18.075811 2064791 cri.go:92] found id: ""
	I1219 06:14:18.075825 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.075833 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:18.075839 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:18.075911 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:18.101578 2064791 cri.go:92] found id: ""
	I1219 06:14:18.101593 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.101601 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:18.101607 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:18.101668 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:18.127312 2064791 cri.go:92] found id: ""
	I1219 06:14:18.127327 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.127335 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:18.127341 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:18.127400 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:18.153616 2064791 cri.go:92] found id: ""
	I1219 06:14:18.153630 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.153637 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:18.153642 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:18.153702 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:18.177937 2064791 cri.go:92] found id: ""
	I1219 06:14:18.177959 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.177967 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:18.177972 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:18.178044 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:18.211563 2064791 cri.go:92] found id: ""
	I1219 06:14:18.211576 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.211583 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:18.211591 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:18.211614 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:18.270162 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:18.270182 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:18.288230 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:18.288247 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:18.351713 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:18.343431   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.343955   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.345552   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.346118   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.347689   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:18.343431   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.343955   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.345552   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.346118   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.347689   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:18.351723 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:18.351734 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:18.415359 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:18.415379 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:20.949383 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:20.959444 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:20.959504 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:20.984028 2064791 cri.go:92] found id: ""
	I1219 06:14:20.984041 2064791 logs.go:282] 0 containers: []
	W1219 06:14:20.984048 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:20.984054 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:20.984114 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:21.011129 2064791 cri.go:92] found id: ""
	I1219 06:14:21.011145 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.011153 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:21.011159 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:21.011232 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:21.036500 2064791 cri.go:92] found id: ""
	I1219 06:14:21.036515 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.036522 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:21.036528 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:21.036593 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:21.061075 2064791 cri.go:92] found id: ""
	I1219 06:14:21.061092 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.061099 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:21.061106 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:21.061164 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:21.086516 2064791 cri.go:92] found id: ""
	I1219 06:14:21.086532 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.086539 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:21.086545 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:21.086606 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:21.110771 2064791 cri.go:92] found id: ""
	I1219 06:14:21.110791 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.110798 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:21.110804 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:21.110861 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:21.135223 2064791 cri.go:92] found id: ""
	I1219 06:14:21.135237 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.135244 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:21.135253 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:21.135262 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:21.198022 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:21.198041 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:21.227058 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:21.227074 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:21.285376 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:21.285395 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:21.302978 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:21.302996 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:21.371361 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:21.363586   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.364188   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365313   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365885   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.367496   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:21.363586   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.364188   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365313   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365885   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.367496   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:23.871625 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:23.882253 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:23.882315 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:23.911700 2064791 cri.go:92] found id: ""
	I1219 06:14:23.911715 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.911722 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:23.911727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:23.911792 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:23.940526 2064791 cri.go:92] found id: ""
	I1219 06:14:23.940542 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.940549 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:23.940554 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:23.940613 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:23.965505 2064791 cri.go:92] found id: ""
	I1219 06:14:23.965520 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.965527 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:23.965532 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:23.965592 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:23.990160 2064791 cri.go:92] found id: ""
	I1219 06:14:23.990174 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.990180 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:23.990186 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:23.990244 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:24.020703 2064791 cri.go:92] found id: ""
	I1219 06:14:24.020718 2064791 logs.go:282] 0 containers: []
	W1219 06:14:24.020731 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:24.020736 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:24.020818 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:24.045597 2064791 cri.go:92] found id: ""
	I1219 06:14:24.045611 2064791 logs.go:282] 0 containers: []
	W1219 06:14:24.045619 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:24.045625 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:24.045687 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:24.070650 2064791 cri.go:92] found id: ""
	I1219 06:14:24.070665 2064791 logs.go:282] 0 containers: []
	W1219 06:14:24.070673 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:24.070681 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:24.070692 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:24.088118 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:24.088135 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:24.154756 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:24.145796   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.146235   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.147863   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.148542   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.150277   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:24.145796   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.146235   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.147863   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.148542   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.150277   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:24.154766 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:24.154777 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:24.222682 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:24.222712 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:24.251017 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:24.251036 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:26.810547 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:26.821800 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:26.821882 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:26.851618 2064791 cri.go:92] found id: ""
	I1219 06:14:26.851632 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.851639 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:26.851644 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:26.851701 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:26.881247 2064791 cri.go:92] found id: ""
	I1219 06:14:26.881261 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.881268 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:26.881273 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:26.881331 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:26.906685 2064791 cri.go:92] found id: ""
	I1219 06:14:26.906698 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.906705 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:26.906710 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:26.906769 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:26.930800 2064791 cri.go:92] found id: ""
	I1219 06:14:26.930814 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.930821 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:26.930826 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:26.930886 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:26.955923 2064791 cri.go:92] found id: ""
	I1219 06:14:26.955936 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.955943 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:26.955949 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:26.956007 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:26.981009 2064791 cri.go:92] found id: ""
	I1219 06:14:26.981023 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.981030 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:26.981036 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:26.981100 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:27.008093 2064791 cri.go:92] found id: ""
	I1219 06:14:27.008107 2064791 logs.go:282] 0 containers: []
	W1219 06:14:27.008115 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:27.008123 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:27.008133 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:27.064465 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:27.064484 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:27.082027 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:27.082043 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:27.147050 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:27.138327   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.139038   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.140978   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.141575   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.143181   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:27.138327   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.139038   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.140978   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.141575   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.143181   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:27.147061 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:27.147072 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:27.209843 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:27.209866 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:29.744581 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:29.755392 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:29.755453 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:29.786638 2064791 cri.go:92] found id: ""
	I1219 06:14:29.786652 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.786659 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:29.786664 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:29.786724 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:29.812212 2064791 cri.go:92] found id: ""
	I1219 06:14:29.812225 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.812232 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:29.812237 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:29.812296 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:29.836877 2064791 cri.go:92] found id: ""
	I1219 06:14:29.836892 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.836899 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:29.836905 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:29.836964 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:29.861702 2064791 cri.go:92] found id: ""
	I1219 06:14:29.861715 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.861722 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:29.861727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:29.861786 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:29.885680 2064791 cri.go:92] found id: ""
	I1219 06:14:29.885694 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.885703 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:29.885708 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:29.885770 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:29.910947 2064791 cri.go:92] found id: ""
	I1219 06:14:29.910961 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.910968 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:29.910973 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:29.911034 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:29.935050 2064791 cri.go:92] found id: ""
	I1219 06:14:29.935065 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.935072 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:29.935080 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:29.935090 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:29.998135 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:29.998156 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:30.043603 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:30.043622 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:30.105767 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:30.105788 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:30.123694 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:30.123713 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:30.194778 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:30.185852   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.186558   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188234   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188924   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.190530   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:30.185852   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.186558   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188234   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188924   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.190530   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:32.694996 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:32.706674 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:32.706732 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:32.732252 2064791 cri.go:92] found id: ""
	I1219 06:14:32.732268 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.732276 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:32.732282 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:32.732344 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:32.758653 2064791 cri.go:92] found id: ""
	I1219 06:14:32.758667 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.758674 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:32.758679 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:32.758739 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:32.784000 2064791 cri.go:92] found id: ""
	I1219 06:14:32.784015 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.784032 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:32.784037 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:32.784104 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:32.812817 2064791 cri.go:92] found id: ""
	I1219 06:14:32.812840 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.812847 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:32.812856 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:32.812927 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:32.838382 2064791 cri.go:92] found id: ""
	I1219 06:14:32.838396 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.838404 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:32.838409 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:32.838470 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:32.865911 2064791 cri.go:92] found id: ""
	I1219 06:14:32.865929 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.865937 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:32.865944 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:32.866010 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:32.890355 2064791 cri.go:92] found id: ""
	I1219 06:14:32.890369 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.890376 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:32.890384 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:32.890394 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:32.946230 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:32.946249 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:32.964055 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:32.964071 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:33.030318 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:33.021305   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.022105   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.023970   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.024656   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.026568   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:33.021305   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.022105   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.023970   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.024656   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.026568   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:33.030328 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:33.030341 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:33.097167 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:33.097188 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:35.628021 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:35.638217 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:35.638279 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:35.675187 2064791 cri.go:92] found id: ""
	I1219 06:14:35.675209 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.675217 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:35.675223 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:35.675283 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:35.703303 2064791 cri.go:92] found id: ""
	I1219 06:14:35.703317 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.703324 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:35.703329 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:35.703387 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:35.736481 2064791 cri.go:92] found id: ""
	I1219 06:14:35.736495 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.736502 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:35.736507 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:35.736571 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:35.761459 2064791 cri.go:92] found id: ""
	I1219 06:14:35.761472 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.761479 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:35.761485 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:35.761542 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:35.785228 2064791 cri.go:92] found id: ""
	I1219 06:14:35.785242 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.785249 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:35.785255 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:35.785317 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:35.811887 2064791 cri.go:92] found id: ""
	I1219 06:14:35.811901 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.811908 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:35.811913 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:35.811971 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:35.837382 2064791 cri.go:92] found id: ""
	I1219 06:14:35.837395 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.837402 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:35.837410 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:35.837420 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:35.893642 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:35.893663 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:35.911983 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:35.911999 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:35.979649 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:35.971161   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.971848   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.973453   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.974018   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.975638   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:35.971161   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.971848   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.973453   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.974018   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.975638   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:35.979659 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:35.979669 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:36.041989 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:36.042008 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:38.571113 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:38.581755 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:38.581829 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:38.606952 2064791 cri.go:92] found id: ""
	I1219 06:14:38.606977 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.606985 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:38.607000 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:38.607062 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:38.641457 2064791 cri.go:92] found id: ""
	I1219 06:14:38.641470 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.641477 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:38.641482 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:38.641544 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:38.675510 2064791 cri.go:92] found id: ""
	I1219 06:14:38.675523 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.675530 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:38.675536 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:38.675597 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:38.701888 2064791 cri.go:92] found id: ""
	I1219 06:14:38.701902 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.701909 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:38.701915 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:38.701975 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:38.728277 2064791 cri.go:92] found id: ""
	I1219 06:14:38.728290 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.728299 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:38.728305 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:38.728365 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:38.755404 2064791 cri.go:92] found id: ""
	I1219 06:14:38.755418 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.755427 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:38.755433 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:38.755495 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:38.778883 2064791 cri.go:92] found id: ""
	I1219 06:14:38.778896 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.778903 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:38.778911 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:38.778921 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:38.807023 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:38.807039 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:38.867198 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:38.867217 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:38.885283 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:38.885299 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:38.953980 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:38.945374   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.946114   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.947740   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.948287   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.949494   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:38.945374   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.946114   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.947740   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.948287   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.949494   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:38.953990 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:38.954002 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:41.516935 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:41.527938 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:41.528001 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:41.553242 2064791 cri.go:92] found id: ""
	I1219 06:14:41.553256 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.553263 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:41.553268 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:41.553333 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:41.579295 2064791 cri.go:92] found id: ""
	I1219 06:14:41.579309 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.579316 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:41.579321 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:41.579385 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:41.605144 2064791 cri.go:92] found id: ""
	I1219 06:14:41.605157 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.605164 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:41.605169 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:41.605237 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:41.629732 2064791 cri.go:92] found id: ""
	I1219 06:14:41.629747 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.629754 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:41.629760 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:41.629822 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:41.659346 2064791 cri.go:92] found id: ""
	I1219 06:14:41.659361 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.659368 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:41.659373 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:41.659432 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:41.690573 2064791 cri.go:92] found id: ""
	I1219 06:14:41.690598 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.690606 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:41.690612 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:41.690681 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:41.732984 2064791 cri.go:92] found id: ""
	I1219 06:14:41.732998 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.733006 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:41.733013 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:41.733023 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:41.795851 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:41.795871 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:41.825041 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:41.825056 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:41.886639 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:41.886659 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:41.904083 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:41.904100 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:41.971851 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:41.963475   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.964169   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.965774   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.966365   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.967995   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:41.963475   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.964169   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.965774   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.966365   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.967995   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:44.473271 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:44.483164 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:44.483222 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:44.511046 2064791 cri.go:92] found id: ""
	I1219 06:14:44.511060 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.511067 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:44.511072 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:44.511131 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:44.536197 2064791 cri.go:92] found id: ""
	I1219 06:14:44.536211 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.536219 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:44.536224 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:44.536283 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:44.562337 2064791 cri.go:92] found id: ""
	I1219 06:14:44.562354 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.562360 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:44.562366 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:44.562474 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:44.587553 2064791 cri.go:92] found id: ""
	I1219 06:14:44.587567 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.587574 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:44.587579 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:44.587637 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:44.614987 2064791 cri.go:92] found id: ""
	I1219 06:14:44.615000 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.615007 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:44.615012 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:44.615070 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:44.638714 2064791 cri.go:92] found id: ""
	I1219 06:14:44.638727 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.638734 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:44.638740 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:44.638800 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:44.688380 2064791 cri.go:92] found id: ""
	I1219 06:14:44.688393 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.688401 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:44.688409 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:44.688419 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:44.752969 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:44.752989 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:44.770407 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:44.770424 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:44.837420 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:44.829128   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.829667   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831337   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831916   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.833576   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:44.829128   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.829667   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831337   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831916   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.833576   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:44.837430 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:44.837440 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:44.899538 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:44.899557 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:47.426650 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:47.436749 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:47.436827 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:47.461986 2064791 cri.go:92] found id: ""
	I1219 06:14:47.462000 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.462007 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:47.462012 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:47.462071 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:47.487738 2064791 cri.go:92] found id: ""
	I1219 06:14:47.487765 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.487785 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:47.487790 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:47.487934 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:47.517657 2064791 cri.go:92] found id: ""
	I1219 06:14:47.517671 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.517678 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:47.517683 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:47.517741 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:47.541725 2064791 cri.go:92] found id: ""
	I1219 06:14:47.541740 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.541747 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:47.541752 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:47.541811 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:47.566613 2064791 cri.go:92] found id: ""
	I1219 06:14:47.566627 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.566634 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:47.566640 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:47.566698 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:47.593670 2064791 cri.go:92] found id: ""
	I1219 06:14:47.593683 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.593690 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:47.593705 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:47.593778 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:47.617501 2064791 cri.go:92] found id: ""
	I1219 06:14:47.617516 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.617523 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:47.617530 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:47.617544 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:47.699175 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:47.685609   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.686090   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.691538   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.692419   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.694959   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:47.685609   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.686090   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.691538   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.692419   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.694959   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:47.699185 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:47.699195 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:47.763955 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:47.763976 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:47.796195 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:47.796212 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:47.855457 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:47.855477 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:50.373913 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:50.384678 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:50.384743 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:50.409292 2064791 cri.go:92] found id: ""
	I1219 06:14:50.409305 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.409314 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:50.409319 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:50.409380 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:50.434622 2064791 cri.go:92] found id: ""
	I1219 06:14:50.434637 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.434644 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:50.434649 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:50.434708 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:50.462727 2064791 cri.go:92] found id: ""
	I1219 06:14:50.462741 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.462748 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:50.462754 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:50.462818 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:50.487565 2064791 cri.go:92] found id: ""
	I1219 06:14:50.487578 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.487586 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:50.487593 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:50.487655 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:50.514337 2064791 cri.go:92] found id: ""
	I1219 06:14:50.514351 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.514358 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:50.514363 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:50.514428 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:50.538808 2064791 cri.go:92] found id: ""
	I1219 06:14:50.538822 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.538829 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:50.538835 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:50.538900 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:50.562833 2064791 cri.go:92] found id: ""
	I1219 06:14:50.562847 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.562854 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:50.562862 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:50.562872 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:50.630176 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:50.621836   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.622705   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624224   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624675   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.626153   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:50.621836   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.622705   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624224   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624675   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.626153   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:50.630187 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:50.630197 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:50.701427 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:50.701449 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:50.729581 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:50.729602 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:50.786455 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:50.786479 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:53.304847 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:53.315504 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:53.315564 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:53.340157 2064791 cri.go:92] found id: ""
	I1219 06:14:53.340172 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.340179 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:53.340184 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:53.340242 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:53.368950 2064791 cri.go:92] found id: ""
	I1219 06:14:53.368964 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.368971 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:53.368976 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:53.369037 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:53.393336 2064791 cri.go:92] found id: ""
	I1219 06:14:53.393349 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.393356 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:53.393362 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:53.393419 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:53.417054 2064791 cri.go:92] found id: ""
	I1219 06:14:53.417069 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.417085 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:53.417091 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:53.417163 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:53.440932 2064791 cri.go:92] found id: ""
	I1219 06:14:53.440946 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.440953 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:53.440958 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:53.441016 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:53.464424 2064791 cri.go:92] found id: ""
	I1219 06:14:53.464437 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.464444 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:53.464449 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:53.464509 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:53.488126 2064791 cri.go:92] found id: ""
	I1219 06:14:53.488143 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.488150 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:53.488158 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:53.488168 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:53.558644 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:53.550747   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.551416   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553060   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553386   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.554859   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:53.550747   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.551416   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553060   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553386   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.554859   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:53.558655 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:53.558665 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:53.622193 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:53.622214 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:53.650744 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:53.650759 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:53.710733 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:53.710750 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:56.228553 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:56.238967 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:56.239030 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:56.262851 2064791 cri.go:92] found id: ""
	I1219 06:14:56.262864 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.262872 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:56.262877 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:56.262943 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:56.287030 2064791 cri.go:92] found id: ""
	I1219 06:14:56.287043 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.287050 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:56.287056 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:56.287118 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:56.312417 2064791 cri.go:92] found id: ""
	I1219 06:14:56.312430 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.312437 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:56.312442 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:56.312505 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:56.350599 2064791 cri.go:92] found id: ""
	I1219 06:14:56.350613 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.350622 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:56.350627 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:56.350686 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:56.374515 2064791 cri.go:92] found id: ""
	I1219 06:14:56.374528 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.374535 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:56.374540 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:56.374596 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:56.399267 2064791 cri.go:92] found id: ""
	I1219 06:14:56.399281 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.399288 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:56.399293 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:56.399351 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:56.424503 2064791 cri.go:92] found id: ""
	I1219 06:14:56.424516 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.424523 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:56.424531 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:56.424541 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:56.490954 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:56.490973 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:56.522329 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:56.522345 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:56.582279 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:56.582298 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:56.599656 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:56.599673 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:56.665092 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:56.656833   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.657522   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659110   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659433   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.661032   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:56.656833   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.657522   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659110   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659433   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.661032   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:59.165361 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:59.178705 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:59.178767 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:59.207415 2064791 cri.go:92] found id: ""
	I1219 06:14:59.207429 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.207436 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:59.207441 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:59.207499 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:59.231912 2064791 cri.go:92] found id: ""
	I1219 06:14:59.231926 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.231934 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:59.231939 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:59.232000 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:59.258822 2064791 cri.go:92] found id: ""
	I1219 06:14:59.258836 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.258843 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:59.258848 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:59.258909 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:59.283942 2064791 cri.go:92] found id: ""
	I1219 06:14:59.283955 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.283963 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:59.283968 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:59.284026 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:59.311236 2064791 cri.go:92] found id: ""
	I1219 06:14:59.311249 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.311256 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:59.311262 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:59.311322 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:59.336239 2064791 cri.go:92] found id: ""
	I1219 06:14:59.336253 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.336260 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:59.336267 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:59.336325 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:59.360395 2064791 cri.go:92] found id: ""
	I1219 06:14:59.360409 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.360417 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:59.360425 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:59.360435 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:59.423580 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:59.423601 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:59.453489 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:59.453506 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:59.512842 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:59.512862 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:59.530149 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:59.530168 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:59.593869 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:59.584731   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.585448   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587088   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587675   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.589312   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:59.584731   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.585448   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587088   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587675   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.589312   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:02.094126 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:02.104778 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:02.104839 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:02.129446 2064791 cri.go:92] found id: ""
	I1219 06:15:02.129462 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.129469 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:02.129474 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:02.129539 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:02.154875 2064791 cri.go:92] found id: ""
	I1219 06:15:02.154889 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.154896 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:02.154901 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:02.155006 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:02.180628 2064791 cri.go:92] found id: ""
	I1219 06:15:02.180643 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.180650 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:02.180655 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:02.180716 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:02.205447 2064791 cri.go:92] found id: ""
	I1219 06:15:02.205462 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.205469 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:02.205475 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:02.205543 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:02.233523 2064791 cri.go:92] found id: ""
	I1219 06:15:02.233537 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.233544 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:02.233550 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:02.233610 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:02.259723 2064791 cri.go:92] found id: ""
	I1219 06:15:02.259738 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.259744 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:02.259750 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:02.259813 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:02.289093 2064791 cri.go:92] found id: ""
	I1219 06:15:02.289108 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.289115 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:02.289123 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:02.289133 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:02.347737 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:02.347758 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:02.365547 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:02.365564 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:02.433606 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:02.424090   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425124   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425822   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.427646   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.428231   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:02.424090   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425124   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425822   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.427646   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.428231   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:02.433616 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:02.433627 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:02.497677 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:02.497697 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:05.027685 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:05.037775 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:05.037845 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:05.062132 2064791 cri.go:92] found id: ""
	I1219 06:15:05.062146 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.062152 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:05.062157 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:05.062230 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:05.087233 2064791 cri.go:92] found id: ""
	I1219 06:15:05.087247 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.087254 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:05.087259 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:05.087318 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:05.116140 2064791 cri.go:92] found id: ""
	I1219 06:15:05.116155 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.116162 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:05.116167 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:05.116229 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:05.141158 2064791 cri.go:92] found id: ""
	I1219 06:15:05.141171 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.141179 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:05.141184 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:05.141255 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:05.166033 2064791 cri.go:92] found id: ""
	I1219 06:15:05.166046 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.166053 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:05.166059 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:05.166118 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:05.189930 2064791 cri.go:92] found id: ""
	I1219 06:15:05.189943 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.189951 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:05.189956 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:05.190013 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:05.217697 2064791 cri.go:92] found id: ""
	I1219 06:15:05.217711 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.217718 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:05.217726 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:05.217737 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:05.273609 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:05.273629 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:05.291274 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:05.291291 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:05.355137 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:05.346587   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.347461   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349261   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349738   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.351338   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:05.346587   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.347461   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349261   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349738   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.351338   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:05.355147 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:05.355158 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:05.418376 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:05.418395 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:07.946932 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:07.957404 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:07.957465 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:07.983257 2064791 cri.go:92] found id: ""
	I1219 06:15:07.983270 2064791 logs.go:282] 0 containers: []
	W1219 06:15:07.983277 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:07.983283 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:07.983344 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:08.010747 2064791 cri.go:92] found id: ""
	I1219 06:15:08.010762 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.010770 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:08.010776 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:08.010842 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:08.040479 2064791 cri.go:92] found id: ""
	I1219 06:15:08.040493 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.040500 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:08.040506 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:08.040566 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:08.067147 2064791 cri.go:92] found id: ""
	I1219 06:15:08.067162 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.067169 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:08.067175 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:08.067238 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:08.096399 2064791 cri.go:92] found id: ""
	I1219 06:15:08.096415 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.096422 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:08.096430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:08.096492 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:08.120924 2064791 cri.go:92] found id: ""
	I1219 06:15:08.120938 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.120945 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:08.120951 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:08.121010 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:08.145044 2064791 cri.go:92] found id: ""
	I1219 06:15:08.145057 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.145064 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:08.145072 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:08.145082 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:08.201643 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:08.201664 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:08.219150 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:08.219166 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:08.285100 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:08.276907   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.277509   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279021   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279543   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.281080   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:08.276907   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.277509   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279021   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279543   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.281080   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:08.285118 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:08.285129 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:08.349440 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:08.349460 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:10.878798 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:10.888854 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:10.888917 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:10.920436 2064791 cri.go:92] found id: ""
	I1219 06:15:10.920450 2064791 logs.go:282] 0 containers: []
	W1219 06:15:10.920457 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:10.920463 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:10.920536 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:10.951229 2064791 cri.go:92] found id: ""
	I1219 06:15:10.951243 2064791 logs.go:282] 0 containers: []
	W1219 06:15:10.951252 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:10.951258 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:10.951315 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:10.980039 2064791 cri.go:92] found id: ""
	I1219 06:15:10.980054 2064791 logs.go:282] 0 containers: []
	W1219 06:15:10.980061 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:10.980066 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:10.980126 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:11.008250 2064791 cri.go:92] found id: ""
	I1219 06:15:11.008265 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.008273 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:11.008278 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:11.008346 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:11.033554 2064791 cri.go:92] found id: ""
	I1219 06:15:11.033568 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.033575 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:11.033580 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:11.033641 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:11.058115 2064791 cri.go:92] found id: ""
	I1219 06:15:11.058128 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.058135 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:11.058141 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:11.058219 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:11.083222 2064791 cri.go:92] found id: ""
	I1219 06:15:11.083236 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.083242 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:11.083250 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:11.083260 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:11.146681 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:11.146702 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:11.176028 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:11.176047 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:11.233340 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:11.233361 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:11.250941 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:11.250957 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:11.315829 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:11.306797   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.307411   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309263   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309846   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.311388   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:11.306797   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.307411   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309263   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309846   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.311388   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:13.816114 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:13.826460 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:13.826527 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:13.850958 2064791 cri.go:92] found id: ""
	I1219 06:15:13.850973 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.850980 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:13.850988 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:13.851048 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:13.879518 2064791 cri.go:92] found id: ""
	I1219 06:15:13.879538 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.879546 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:13.879551 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:13.879611 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:13.917876 2064791 cri.go:92] found id: ""
	I1219 06:15:13.917890 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.917897 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:13.917902 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:13.917965 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:13.957039 2064791 cri.go:92] found id: ""
	I1219 06:15:13.957053 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.957060 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:13.957065 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:13.957126 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:13.992398 2064791 cri.go:92] found id: ""
	I1219 06:15:13.992412 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.992419 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:13.992424 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:13.992486 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:14.019915 2064791 cri.go:92] found id: ""
	I1219 06:15:14.019930 2064791 logs.go:282] 0 containers: []
	W1219 06:15:14.019938 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:14.019943 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:14.020004 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:14.045800 2064791 cri.go:92] found id: ""
	I1219 06:15:14.045815 2064791 logs.go:282] 0 containers: []
	W1219 06:15:14.045822 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:14.045830 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:14.045841 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:14.102453 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:14.102472 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:14.120093 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:14.120110 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:14.183187 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:14.175289   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.175797   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177338   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177777   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.179247   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:14.175289   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.175797   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177338   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177777   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.179247   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:14.183198 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:14.183209 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:14.246652 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:14.246673 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:16.780257 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:16.790741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:16.790802 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:16.815777 2064791 cri.go:92] found id: ""
	I1219 06:15:16.815802 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.815809 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:16.815815 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:16.815890 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:16.841105 2064791 cri.go:92] found id: ""
	I1219 06:15:16.841124 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.841142 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:16.841148 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:16.841217 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:16.866795 2064791 cri.go:92] found id: ""
	I1219 06:15:16.866820 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.866827 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:16.866833 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:16.866910 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:16.892692 2064791 cri.go:92] found id: ""
	I1219 06:15:16.892706 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.892713 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:16.892718 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:16.892803 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:16.926258 2064791 cri.go:92] found id: ""
	I1219 06:15:16.926272 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.926279 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:16.926285 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:16.926346 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:16.955968 2064791 cri.go:92] found id: ""
	I1219 06:15:16.955982 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.955989 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:16.955995 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:16.956057 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:16.985158 2064791 cri.go:92] found id: ""
	I1219 06:15:16.985172 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.985179 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:16.985186 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:16.985196 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:17.043879 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:17.043899 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:17.061599 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:17.061616 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:17.125509 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:17.117153   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.117733   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119480   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119896   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.121399   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:17.117153   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.117733   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119480   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119896   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.121399   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:17.125519 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:17.125531 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:17.189339 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:17.189359 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:19.721517 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:19.731846 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:19.731916 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:19.758133 2064791 cri.go:92] found id: ""
	I1219 06:15:19.758147 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.758154 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:19.758160 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:19.758228 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:19.787023 2064791 cri.go:92] found id: ""
	I1219 06:15:19.787037 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.787045 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:19.787059 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:19.787123 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:19.813855 2064791 cri.go:92] found id: ""
	I1219 06:15:19.813869 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.813876 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:19.813881 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:19.813944 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:19.838418 2064791 cri.go:92] found id: ""
	I1219 06:15:19.838432 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.838439 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:19.838444 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:19.838508 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:19.863215 2064791 cri.go:92] found id: ""
	I1219 06:15:19.863229 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.863240 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:19.863246 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:19.863307 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:19.887732 2064791 cri.go:92] found id: ""
	I1219 06:15:19.887746 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.887753 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:19.887758 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:19.887815 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:19.930174 2064791 cri.go:92] found id: ""
	I1219 06:15:19.930192 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.930200 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:19.930208 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:19.930222 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:19.949025 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:19.949041 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:20.022932 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:20.013526   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.014350   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016121   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016702   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.018424   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:20.013526   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.014350   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016121   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016702   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.018424   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:20.022944 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:20.022955 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:20.088903 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:20.088924 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:20.117778 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:20.117794 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:22.677536 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:22.687468 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:22.687536 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:22.712714 2064791 cri.go:92] found id: ""
	I1219 06:15:22.712728 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.712736 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:22.712741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:22.712816 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:22.736316 2064791 cri.go:92] found id: ""
	I1219 06:15:22.736329 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.736336 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:22.736341 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:22.736401 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:22.762215 2064791 cri.go:92] found id: ""
	I1219 06:15:22.762229 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.762236 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:22.762241 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:22.762309 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:22.787061 2064791 cri.go:92] found id: ""
	I1219 06:15:22.787074 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.787081 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:22.787086 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:22.787146 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:22.814937 2064791 cri.go:92] found id: ""
	I1219 06:15:22.814951 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.814957 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:22.814963 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:22.815033 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:22.842839 2064791 cri.go:92] found id: ""
	I1219 06:15:22.842853 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.842859 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:22.842865 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:22.842923 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:22.869394 2064791 cri.go:92] found id: ""
	I1219 06:15:22.869407 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.869413 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:22.869421 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:22.869430 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:22.926492 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:22.926510 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:22.944210 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:22.944232 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:23.013797 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:23.003493   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.005070   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.006087   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.007846   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.008447   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:23.003493   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.005070   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.006087   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.007846   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.008447   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:23.013807 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:23.013821 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:23.081279 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:23.081306 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:25.612946 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:25.622887 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:25.622947 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:25.656332 2064791 cri.go:92] found id: ""
	I1219 06:15:25.656346 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.656353 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:25.656359 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:25.656425 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:25.680887 2064791 cri.go:92] found id: ""
	I1219 06:15:25.680901 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.680908 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:25.680913 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:25.680981 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:25.705508 2064791 cri.go:92] found id: ""
	I1219 06:15:25.705523 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.705531 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:25.705536 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:25.705598 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:25.729434 2064791 cri.go:92] found id: ""
	I1219 06:15:25.729447 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.729454 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:25.729459 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:25.729517 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:25.755351 2064791 cri.go:92] found id: ""
	I1219 06:15:25.755365 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.755381 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:25.755388 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:25.755449 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:25.782840 2064791 cri.go:92] found id: ""
	I1219 06:15:25.782854 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.782861 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:25.782866 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:25.782929 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:25.811125 2064791 cri.go:92] found id: ""
	I1219 06:15:25.811139 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.811155 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:25.811165 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:25.811175 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:25.867579 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:25.867601 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:25.884977 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:25.884996 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:25.983099 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:25.974919   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.975374   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.977076   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.977555   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.979165   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:25.974919   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.975374   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.977076   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.977555   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.979165   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:25.983110 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:25.983119 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:26.047515 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:26.047534 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:28.576468 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:28.586983 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:28.587044 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:28.612243 2064791 cri.go:92] found id: ""
	I1219 06:15:28.612257 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.612264 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:28.612270 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:28.612331 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:28.637476 2064791 cri.go:92] found id: ""
	I1219 06:15:28.637490 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.637496 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:28.637502 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:28.637564 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:28.662778 2064791 cri.go:92] found id: ""
	I1219 06:15:28.662792 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.662800 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:28.662805 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:28.662864 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:28.687078 2064791 cri.go:92] found id: ""
	I1219 06:15:28.687091 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.687098 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:28.687105 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:28.687166 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:28.712552 2064791 cri.go:92] found id: ""
	I1219 06:15:28.712566 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.712572 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:28.712577 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:28.712646 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:28.738798 2064791 cri.go:92] found id: ""
	I1219 06:15:28.738812 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.738819 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:28.738824 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:28.738881 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:28.767309 2064791 cri.go:92] found id: ""
	I1219 06:15:28.767324 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.767340 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:28.767349 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:28.767358 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:28.827489 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:28.827509 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:28.844978 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:28.844994 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:28.915425 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:28.906948   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.907778   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.909411   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.909881   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.911514   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:28.906948   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.907778   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.909411   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.909881   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.911514   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:28.915435 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:28.915445 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:28.980721 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:28.980742 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:31.518692 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:31.528660 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:31.528719 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:31.551685 2064791 cri.go:92] found id: ""
	I1219 06:15:31.551699 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.551706 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:31.551711 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:31.551772 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:31.578616 2064791 cri.go:92] found id: ""
	I1219 06:15:31.578631 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.578637 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:31.578643 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:31.578703 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:31.602562 2064791 cri.go:92] found id: ""
	I1219 06:15:31.602576 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.602582 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:31.602588 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:31.602646 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:31.626697 2064791 cri.go:92] found id: ""
	I1219 06:15:31.626711 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.626718 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:31.626723 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:31.626786 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:31.650705 2064791 cri.go:92] found id: ""
	I1219 06:15:31.650718 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.650725 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:31.650730 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:31.650791 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:31.675292 2064791 cri.go:92] found id: ""
	I1219 06:15:31.675305 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.675312 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:31.675318 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:31.675390 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:31.699969 2064791 cri.go:92] found id: ""
	I1219 06:15:31.699993 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.700000 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:31.700008 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:31.700018 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:31.765728 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:31.765750 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:31.793450 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:31.793466 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:31.849244 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:31.849262 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:31.866467 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:31.866483 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:31.960156 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:31.933314   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.948921   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.949480   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.951697   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.951968   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:31.933314   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.948921   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.949480   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.951697   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.951968   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:34.460923 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:34.473072 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:34.473134 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:34.498075 2064791 cri.go:92] found id: ""
	I1219 06:15:34.498089 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.498097 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:34.498103 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:34.498162 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:34.522785 2064791 cri.go:92] found id: ""
	I1219 06:15:34.522800 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.522807 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:34.522812 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:34.522871 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:34.550566 2064791 cri.go:92] found id: ""
	I1219 06:15:34.550580 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.550587 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:34.550592 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:34.550651 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:34.579586 2064791 cri.go:92] found id: ""
	I1219 06:15:34.579600 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.579607 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:34.579612 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:34.579670 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:34.606248 2064791 cri.go:92] found id: ""
	I1219 06:15:34.606261 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.606269 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:34.606274 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:34.606335 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:34.634419 2064791 cri.go:92] found id: ""
	I1219 06:15:34.634433 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.634440 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:34.634446 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:34.634509 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:34.658438 2064791 cri.go:92] found id: ""
	I1219 06:15:34.658451 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.658458 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:34.658465 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:34.658475 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:34.675933 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:34.675950 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:34.740273 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:34.732883   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.733297   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.734737   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.735051   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.736480   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:34.732883   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.733297   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.734737   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.735051   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.736480   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:34.740283 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:34.740293 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:34.802357 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:34.802378 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:34.833735 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:34.833751 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:37.390170 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:37.400300 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:37.400358 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:37.425095 2064791 cri.go:92] found id: ""
	I1219 06:15:37.425110 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.425117 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:37.425122 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:37.425178 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:37.451178 2064791 cri.go:92] found id: ""
	I1219 06:15:37.451192 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.451199 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:37.451205 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:37.451273 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:37.475828 2064791 cri.go:92] found id: ""
	I1219 06:15:37.475842 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.475848 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:37.475854 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:37.475911 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:37.499474 2064791 cri.go:92] found id: ""
	I1219 06:15:37.499488 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.499494 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:37.499500 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:37.499563 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:37.523636 2064791 cri.go:92] found id: ""
	I1219 06:15:37.523649 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.523656 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:37.523662 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:37.523720 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:37.547846 2064791 cri.go:92] found id: ""
	I1219 06:15:37.547859 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.547868 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:37.547873 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:37.547929 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:37.574766 2064791 cri.go:92] found id: ""
	I1219 06:15:37.574780 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.574787 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:37.574795 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:37.574805 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:37.601905 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:37.601923 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:37.657564 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:37.657584 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:37.674777 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:37.674793 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:37.736918 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:37.728853   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.729594   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731160   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731519   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.733094   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:37.728853   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.729594   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731160   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731519   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.733094   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:37.736928 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:37.736939 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:40.303769 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:40.313854 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:40.313919 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:40.338505 2064791 cri.go:92] found id: ""
	I1219 06:15:40.338519 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.338527 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:40.338532 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:40.338594 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:40.363391 2064791 cri.go:92] found id: ""
	I1219 06:15:40.363405 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.363412 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:40.363417 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:40.363476 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:40.389092 2064791 cri.go:92] found id: ""
	I1219 06:15:40.389105 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.389113 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:40.389118 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:40.389184 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:40.412993 2064791 cri.go:92] found id: ""
	I1219 06:15:40.413007 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.413014 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:40.413022 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:40.413087 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:40.438530 2064791 cri.go:92] found id: ""
	I1219 06:15:40.438544 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.438550 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:40.438556 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:40.438617 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:40.462221 2064791 cri.go:92] found id: ""
	I1219 06:15:40.462235 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.462242 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:40.462248 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:40.462310 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:40.487125 2064791 cri.go:92] found id: ""
	I1219 06:15:40.487139 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.487146 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:40.487155 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:40.487165 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:40.543163 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:40.543184 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:40.560362 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:40.560379 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:40.627130 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:40.619309   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.619937   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621423   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621900   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.623348   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:40.619309   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.619937   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621423   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621900   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.623348   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:40.627139 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:40.627149 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:40.689654 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:40.689673 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:43.219338 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:43.229544 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:43.229607 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:43.253914 2064791 cri.go:92] found id: ""
	I1219 06:15:43.253935 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.253941 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:43.253947 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:43.254007 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:43.279019 2064791 cri.go:92] found id: ""
	I1219 06:15:43.279033 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.279040 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:43.279045 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:43.279106 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:43.304187 2064791 cri.go:92] found id: ""
	I1219 06:15:43.304202 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.304209 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:43.304216 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:43.304275 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:43.327938 2064791 cri.go:92] found id: ""
	I1219 06:15:43.327951 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.327958 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:43.327963 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:43.328027 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:43.356864 2064791 cri.go:92] found id: ""
	I1219 06:15:43.356878 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.356885 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:43.356891 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:43.356958 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:43.381050 2064791 cri.go:92] found id: ""
	I1219 06:15:43.381063 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.381070 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:43.381076 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:43.381138 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:43.404804 2064791 cri.go:92] found id: ""
	I1219 06:15:43.404818 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.404825 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:43.404832 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:43.404857 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:43.470026 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:43.461361   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.461922   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.463417   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.464514   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.465204   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:43.461361   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.461922   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.463417   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.464514   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.465204   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:43.470036 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:43.470050 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:43.533067 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:43.533086 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:43.560074 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:43.560097 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:43.618564 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:43.618582 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:46.135866 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:46.146429 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:46.146493 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:46.180564 2064791 cri.go:92] found id: ""
	I1219 06:15:46.180578 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.180595 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:46.180601 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:46.180669 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:46.208067 2064791 cri.go:92] found id: ""
	I1219 06:15:46.208081 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.208087 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:46.208100 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:46.208159 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:46.234676 2064791 cri.go:92] found id: ""
	I1219 06:15:46.234692 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.234703 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:46.234709 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:46.234775 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:46.259673 2064791 cri.go:92] found id: ""
	I1219 06:15:46.259686 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.259693 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:46.259707 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:46.259765 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:46.286964 2064791 cri.go:92] found id: ""
	I1219 06:15:46.286979 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.286986 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:46.286992 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:46.287056 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:46.312785 2064791 cri.go:92] found id: ""
	I1219 06:15:46.312800 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.312807 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:46.312813 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:46.312875 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:46.339250 2064791 cri.go:92] found id: ""
	I1219 06:15:46.339264 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.339271 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:46.339279 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:46.339290 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:46.368113 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:46.368129 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:46.423008 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:46.423029 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:46.440481 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:46.440503 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:46.504270 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:46.496670   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.497191   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.498736   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.499181   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.500593   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:46.496670   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.497191   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.498736   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.499181   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.500593   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:46.504280 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:46.504291 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:49.065736 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:49.075993 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:49.076057 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:49.102714 2064791 cri.go:92] found id: ""
	I1219 06:15:49.102729 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.102736 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:49.102741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:49.102808 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:49.131284 2064791 cri.go:92] found id: ""
	I1219 06:15:49.131297 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.131323 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:49.131328 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:49.131398 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:49.166942 2064791 cri.go:92] found id: ""
	I1219 06:15:49.166955 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.166962 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:49.166968 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:49.167036 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:49.204412 2064791 cri.go:92] found id: ""
	I1219 06:15:49.204425 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.204444 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:49.204450 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:49.204522 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:49.232351 2064791 cri.go:92] found id: ""
	I1219 06:15:49.232364 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.232371 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:49.232377 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:49.232434 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:49.257013 2064791 cri.go:92] found id: ""
	I1219 06:15:49.257028 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.257046 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:49.257052 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:49.257112 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:49.282354 2064791 cri.go:92] found id: ""
	I1219 06:15:49.282368 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.282375 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:49.282384 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:49.282396 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:49.351742 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:49.342325   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.343272   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345051   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345596   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.347231   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:49.342325   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.343272   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345051   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345596   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.347231   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:49.351753 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:49.351764 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:49.416971 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:49.416991 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:49.445804 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:49.445819 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:49.503988 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:49.504006 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:52.023309 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:52.034750 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:52.034819 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:52.061000 2064791 cri.go:92] found id: ""
	I1219 06:15:52.061014 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.061021 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:52.061026 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:52.061084 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:52.086949 2064791 cri.go:92] found id: ""
	I1219 06:15:52.086964 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.086971 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:52.086977 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:52.087048 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:52.112534 2064791 cri.go:92] found id: ""
	I1219 06:15:52.112549 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.112556 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:52.112562 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:52.112635 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:52.137132 2064791 cri.go:92] found id: ""
	I1219 06:15:52.137146 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.137154 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:52.137160 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:52.137221 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:52.191157 2064791 cri.go:92] found id: ""
	I1219 06:15:52.191171 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.191178 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:52.191184 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:52.191245 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:52.220921 2064791 cri.go:92] found id: ""
	I1219 06:15:52.220936 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.220942 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:52.220948 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:52.221009 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:52.250645 2064791 cri.go:92] found id: ""
	I1219 06:15:52.250658 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.250665 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:52.250673 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:52.250684 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:52.306199 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:52.306222 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:52.323553 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:52.323570 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:52.386634 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:52.378311   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.379023   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.380829   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.381330   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.382864   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:52.378311   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.379023   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.380829   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.381330   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.382864   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:52.386643 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:52.386653 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:52.450135 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:52.450155 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:54.981347 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:54.991806 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:54.991864 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:55.028687 2064791 cri.go:92] found id: ""
	I1219 06:15:55.028702 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.028709 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:55.028714 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:55.028797 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:55.053716 2064791 cri.go:92] found id: ""
	I1219 06:15:55.053730 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.053737 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:55.053784 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:55.053857 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:55.080935 2064791 cri.go:92] found id: ""
	I1219 06:15:55.080949 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.080957 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:55.080962 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:55.081027 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:55.109910 2064791 cri.go:92] found id: ""
	I1219 06:15:55.109925 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.109932 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:55.109938 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:55.110005 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:55.138372 2064791 cri.go:92] found id: ""
	I1219 06:15:55.138386 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.138393 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:55.138400 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:55.138463 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:55.172107 2064791 cri.go:92] found id: ""
	I1219 06:15:55.172121 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.172128 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:55.172133 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:55.172191 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:55.207670 2064791 cri.go:92] found id: ""
	I1219 06:15:55.207684 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.207690 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:55.207698 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:55.207708 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:55.273955 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:55.273975 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:55.303942 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:55.303960 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:55.367492 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:55.367517 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:55.384909 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:55.384933 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:55.447954 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:55.439722   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.440407   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442003   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442525   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.444025   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:55.439722   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.440407   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442003   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442525   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.444025   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:57.948746 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:57.959024 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:57.959084 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:57.984251 2064791 cri.go:92] found id: ""
	I1219 06:15:57.984264 2064791 logs.go:282] 0 containers: []
	W1219 06:15:57.984271 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:57.984277 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:57.984335 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:58.012444 2064791 cri.go:92] found id: ""
	I1219 06:15:58.012459 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.012467 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:58.012472 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:58.012531 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:58.040674 2064791 cri.go:92] found id: ""
	I1219 06:15:58.040688 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.040695 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:58.040700 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:58.040783 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:58.066507 2064791 cri.go:92] found id: ""
	I1219 06:15:58.066522 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.066529 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:58.066535 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:58.066598 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:58.095594 2064791 cri.go:92] found id: ""
	I1219 06:15:58.095608 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.095615 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:58.095620 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:58.095680 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:58.121624 2064791 cri.go:92] found id: ""
	I1219 06:15:58.121638 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.121644 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:58.121650 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:58.121707 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:58.149741 2064791 cri.go:92] found id: ""
	I1219 06:15:58.149755 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.149762 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:58.149770 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:58.149782 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:58.181272 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:58.181288 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:58.240957 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:58.240987 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:58.258044 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:58.258060 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:58.322228 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:58.314484   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.315163   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.316700   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.317166   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.318619   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:58.314484   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.315163   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.316700   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.317166   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.318619   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:58.322239 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:58.322250 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:00.885057 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:00.895320 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:00.895386 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:00.919880 2064791 cri.go:92] found id: ""
	I1219 06:16:00.919914 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.919922 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:00.919927 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:00.919995 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:00.944225 2064791 cri.go:92] found id: ""
	I1219 06:16:00.944238 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.944245 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:00.944250 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:00.944316 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:00.969895 2064791 cri.go:92] found id: ""
	I1219 06:16:00.969909 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.969916 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:00.969921 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:00.969982 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:00.994103 2064791 cri.go:92] found id: ""
	I1219 06:16:00.994118 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.994134 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:00.994141 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:00.994224 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:01.021151 2064791 cri.go:92] found id: ""
	I1219 06:16:01.021166 2064791 logs.go:282] 0 containers: []
	W1219 06:16:01.021172 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:01.021181 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:01.021244 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:01.046747 2064791 cri.go:92] found id: ""
	I1219 06:16:01.046761 2064791 logs.go:282] 0 containers: []
	W1219 06:16:01.046768 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:01.046773 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:01.046831 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:01.071655 2064791 cri.go:92] found id: ""
	I1219 06:16:01.071672 2064791 logs.go:282] 0 containers: []
	W1219 06:16:01.071679 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:01.071686 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:01.071696 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:01.127618 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:01.127636 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:01.145631 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:01.145650 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:01.235681 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:01.226719   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.227442   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229160   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229807   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.231513   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:01.226719   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.227442   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229160   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229807   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.231513   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:01.235691 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:01.235703 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:01.299234 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:01.299254 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:03.829050 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:03.839364 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:03.839436 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:03.871023 2064791 cri.go:92] found id: ""
	I1219 06:16:03.871037 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.871044 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:03.871049 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:03.871107 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:03.895774 2064791 cri.go:92] found id: ""
	I1219 06:16:03.895788 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.895795 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:03.895800 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:03.895859 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:03.921890 2064791 cri.go:92] found id: ""
	I1219 06:16:03.921904 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.921911 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:03.921916 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:03.921978 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:03.946705 2064791 cri.go:92] found id: ""
	I1219 06:16:03.946719 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.946726 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:03.946731 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:03.946790 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:03.972566 2064791 cri.go:92] found id: ""
	I1219 06:16:03.972579 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.972605 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:03.972610 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:03.972676 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:03.998217 2064791 cri.go:92] found id: ""
	I1219 06:16:03.998232 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.998239 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:03.998245 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:03.998311 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:04.024748 2064791 cri.go:92] found id: ""
	I1219 06:16:04.024786 2064791 logs.go:282] 0 containers: []
	W1219 06:16:04.024793 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:04.024802 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:04.024827 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:04.089385 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:04.089406 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:04.120677 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:04.120695 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:04.178263 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:04.178282 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:04.201672 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:04.201688 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:04.272543 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:04.263798   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.264930   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.265587   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267210   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267480   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:04.263798   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.264930   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.265587   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267210   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267480   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:06.772819 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:06.784042 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:06.784119 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:06.809087 2064791 cri.go:92] found id: ""
	I1219 06:16:06.809101 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.809108 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:06.809113 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:06.809171 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:06.833636 2064791 cri.go:92] found id: ""
	I1219 06:16:06.833649 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.833656 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:06.833661 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:06.833726 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:06.862766 2064791 cri.go:92] found id: ""
	I1219 06:16:06.862781 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.862788 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:06.862797 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:06.862858 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:06.887915 2064791 cri.go:92] found id: ""
	I1219 06:16:06.887929 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.887935 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:06.887940 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:06.888001 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:06.913093 2064791 cri.go:92] found id: ""
	I1219 06:16:06.913107 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.913114 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:06.913119 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:06.913184 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:06.944662 2064791 cri.go:92] found id: ""
	I1219 06:16:06.944677 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.944695 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:06.944700 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:06.944796 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:06.976908 2064791 cri.go:92] found id: ""
	I1219 06:16:06.976923 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.976929 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:06.976937 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:06.976948 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:07.041844 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:07.041865 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:07.071749 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:07.071765 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:07.130039 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:07.130060 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:07.147749 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:07.147766 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:07.226540 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:07.218267   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.218857   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220373   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220937   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.222460   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:07.218267   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.218857   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220373   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220937   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.222460   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:09.726802 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:09.737347 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:09.737408 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:09.761740 2064791 cri.go:92] found id: ""
	I1219 06:16:09.761754 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.761761 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:09.761767 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:09.761838 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:09.787861 2064791 cri.go:92] found id: ""
	I1219 06:16:09.787876 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.787883 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:09.787888 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:09.787950 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:09.812599 2064791 cri.go:92] found id: ""
	I1219 06:16:09.812613 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.812620 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:09.812625 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:09.812687 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:09.837573 2064791 cri.go:92] found id: ""
	I1219 06:16:09.837588 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.837596 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:09.837601 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:09.837661 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:09.861697 2064791 cri.go:92] found id: ""
	I1219 06:16:09.861712 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.861718 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:09.861723 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:09.861788 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:09.886842 2064791 cri.go:92] found id: ""
	I1219 06:16:09.886856 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.886872 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:09.886884 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:09.886956 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:09.912372 2064791 cri.go:92] found id: ""
	I1219 06:16:09.912387 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.912395 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:09.912403 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:09.912413 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:09.971481 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:09.971501 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:09.989303 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:09.989320 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:10.067493 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:10.058017   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059169   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059962   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.061845   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.062203   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:10.058017   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059169   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059962   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.061845   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.062203   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:10.067504 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:10.067517 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:10.132042 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:10.132062 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:12.664804 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:12.675466 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:12.675550 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:12.704963 2064791 cri.go:92] found id: ""
	I1219 06:16:12.704978 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.704985 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:12.704990 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:12.705052 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:12.730087 2064791 cri.go:92] found id: ""
	I1219 06:16:12.730103 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.730110 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:12.730115 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:12.730178 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:12.760566 2064791 cri.go:92] found id: ""
	I1219 06:16:12.760595 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.760602 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:12.760608 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:12.760675 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:12.785694 2064791 cri.go:92] found id: ""
	I1219 06:16:12.785707 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.785714 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:12.785719 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:12.785781 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:12.811923 2064791 cri.go:92] found id: ""
	I1219 06:16:12.811938 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.811956 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:12.811962 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:12.812036 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:12.838424 2064791 cri.go:92] found id: ""
	I1219 06:16:12.838438 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.838445 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:12.838451 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:12.838514 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:12.864177 2064791 cri.go:92] found id: ""
	I1219 06:16:12.864191 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.864198 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:12.864206 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:12.864216 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:12.920882 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:12.920904 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:12.937942 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:12.937959 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:13.004209 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:12.994302   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.994966   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.996691   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.997250   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.998931   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:12.994302   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.994966   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.996691   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.997250   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.998931   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:13.004223 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:13.004247 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:13.067051 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:13.067071 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:15.596451 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:15.606953 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:15.607013 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:15.639546 2064791 cri.go:92] found id: ""
	I1219 06:16:15.639560 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.639569 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:15.639574 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:15.639637 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:15.667230 2064791 cri.go:92] found id: ""
	I1219 06:16:15.667245 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.667252 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:15.667257 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:15.667321 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:15.693059 2064791 cri.go:92] found id: ""
	I1219 06:16:15.693073 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.693080 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:15.693086 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:15.693145 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:15.718341 2064791 cri.go:92] found id: ""
	I1219 06:16:15.718356 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.718363 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:15.718368 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:15.718437 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:15.744544 2064791 cri.go:92] found id: ""
	I1219 06:16:15.744559 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.744566 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:15.744571 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:15.744632 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:15.769809 2064791 cri.go:92] found id: ""
	I1219 06:16:15.769823 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.769830 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:15.769836 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:15.769897 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:15.793872 2064791 cri.go:92] found id: ""
	I1219 06:16:15.793887 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.793894 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:15.793902 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:15.793914 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:15.811209 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:15.811228 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:15.875495 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:15.867475   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.868031   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.869581   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.870044   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.871521   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:15.867475   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.868031   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.869581   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.870044   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.871521   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:15.875504 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:15.875516 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:15.938869 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:15.938889 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:15.967183 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:15.967200 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:18.524056 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:18.534213 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:18.534283 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:18.558904 2064791 cri.go:92] found id: ""
	I1219 06:16:18.558918 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.558924 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:18.558929 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:18.558994 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:18.583638 2064791 cri.go:92] found id: ""
	I1219 06:16:18.583653 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.583661 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:18.583666 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:18.583726 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:18.611047 2064791 cri.go:92] found id: ""
	I1219 06:16:18.611061 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.611068 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:18.611073 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:18.611133 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:18.635234 2064791 cri.go:92] found id: ""
	I1219 06:16:18.635248 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.635255 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:18.635261 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:18.635322 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:18.658732 2064791 cri.go:92] found id: ""
	I1219 06:16:18.658747 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.658754 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:18.658759 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:18.658819 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:18.687782 2064791 cri.go:92] found id: ""
	I1219 06:16:18.687796 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.687803 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:18.687808 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:18.687871 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:18.713641 2064791 cri.go:92] found id: ""
	I1219 06:16:18.713655 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.713662 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:18.713670 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:18.713687 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:18.730768 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:18.730786 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:18.797385 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:18.788999   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.789629   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791299   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791871   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.793492   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:18.788999   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.789629   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791299   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791871   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.793492   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:18.797396 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:18.797406 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:18.861009 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:18.861029 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:18.889085 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:18.889102 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:21.448880 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:21.458996 2064791 kubeadm.go:602] duration metric: took 4m4.592886052s to restartPrimaryControlPlane
	W1219 06:16:21.459078 2064791 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1219 06:16:21.459152 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1219 06:16:21.873036 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:16:21.887075 2064791 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1219 06:16:21.894868 2064791 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1219 06:16:21.894925 2064791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:16:21.902909 2064791 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1219 06:16:21.902919 2064791 kubeadm.go:158] found existing configuration files:
	
	I1219 06:16:21.902973 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 06:16:21.912282 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1219 06:16:21.912342 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1219 06:16:21.920310 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 06:16:21.928090 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1219 06:16:21.928158 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:16:21.935829 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 06:16:21.944085 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1219 06:16:21.944143 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:16:21.951866 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 06:16:21.959883 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1219 06:16:21.959950 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:16:21.967628 2064791 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1219 06:16:22.006002 2064791 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1219 06:16:22.006076 2064791 kubeadm.go:319] [preflight] Running pre-flight checks
	I1219 06:16:22.084826 2064791 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1219 06:16:22.084890 2064791 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1219 06:16:22.084925 2064791 kubeadm.go:319] OS: Linux
	I1219 06:16:22.084969 2064791 kubeadm.go:319] CGROUPS_CPU: enabled
	I1219 06:16:22.085017 2064791 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1219 06:16:22.085068 2064791 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1219 06:16:22.085115 2064791 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1219 06:16:22.085163 2064791 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1219 06:16:22.085209 2064791 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1219 06:16:22.085254 2064791 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1219 06:16:22.085302 2064791 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1219 06:16:22.085348 2064791 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1219 06:16:22.154531 2064791 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1219 06:16:22.154670 2064791 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1219 06:16:22.154781 2064791 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1219 06:16:22.163477 2064791 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1219 06:16:22.169007 2064791 out.go:252]   - Generating certificates and keys ...
	I1219 06:16:22.169099 2064791 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1219 06:16:22.169162 2064791 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1219 06:16:22.169237 2064791 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1219 06:16:22.169297 2064791 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1219 06:16:22.169372 2064791 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1219 06:16:22.169426 2064791 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1219 06:16:22.169488 2064791 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1219 06:16:22.169549 2064791 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1219 06:16:22.169633 2064791 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1219 06:16:22.169704 2064791 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1219 06:16:22.169741 2064791 kubeadm.go:319] [certs] Using the existing "sa" key
	I1219 06:16:22.169795 2064791 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1219 06:16:22.320644 2064791 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1219 06:16:22.743805 2064791 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1219 06:16:22.867878 2064791 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1219 06:16:22.974729 2064791 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1219 06:16:23.395365 2064791 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1219 06:16:23.396030 2064791 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1219 06:16:23.399355 2064791 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1219 06:16:23.402464 2064791 out.go:252]   - Booting up control plane ...
	I1219 06:16:23.402561 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1219 06:16:23.402637 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1219 06:16:23.403521 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1219 06:16:23.423590 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1219 06:16:23.423990 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1219 06:16:23.431661 2064791 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1219 06:16:23.431897 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1219 06:16:23.432074 2064791 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1219 06:16:23.567443 2064791 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1219 06:16:23.567557 2064791 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1219 06:20:23.567966 2064791 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000561406s
	I1219 06:20:23.567991 2064791 kubeadm.go:319] 
	I1219 06:20:23.568084 2064791 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1219 06:20:23.568128 2064791 kubeadm.go:319] 	- The kubelet is not running
	I1219 06:20:23.568239 2064791 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1219 06:20:23.568244 2064791 kubeadm.go:319] 
	I1219 06:20:23.568354 2064791 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1219 06:20:23.568390 2064791 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1219 06:20:23.568420 2064791 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1219 06:20:23.568423 2064791 kubeadm.go:319] 
	I1219 06:20:23.572732 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 06:20:23.573205 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1219 06:20:23.573348 2064791 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 06:20:23.573651 2064791 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1219 06:20:23.573656 2064791 kubeadm.go:319] 
	W1219 06:20:23.573846 2064791 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000561406s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1219 06:20:23.573948 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1219 06:20:23.574218 2064791 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1219 06:20:23.984042 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:20:23.997740 2064791 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1219 06:20:23.997798 2064791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:20:24.008638 2064791 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1219 06:20:24.008649 2064791 kubeadm.go:158] found existing configuration files:
	
	I1219 06:20:24.008724 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 06:20:24.018051 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1219 06:20:24.018112 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1219 06:20:24.026089 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 06:20:24.034468 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1219 06:20:24.034524 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:20:24.042330 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 06:20:24.050325 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1219 06:20:24.050390 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:20:24.058263 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 06:20:24.066872 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1219 06:20:24.066933 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:20:24.075206 2064791 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1219 06:20:24.113532 2064791 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1219 06:20:24.113595 2064791 kubeadm.go:319] [preflight] Running pre-flight checks
	I1219 06:20:24.190273 2064791 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1219 06:20:24.190347 2064791 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1219 06:20:24.190399 2064791 kubeadm.go:319] OS: Linux
	I1219 06:20:24.190447 2064791 kubeadm.go:319] CGROUPS_CPU: enabled
	I1219 06:20:24.190497 2064791 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1219 06:20:24.190547 2064791 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1219 06:20:24.190597 2064791 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1219 06:20:24.190648 2064791 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1219 06:20:24.190697 2064791 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1219 06:20:24.190745 2064791 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1219 06:20:24.190796 2064791 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1219 06:20:24.190844 2064791 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1219 06:20:24.261095 2064791 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1219 06:20:24.261198 2064791 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1219 06:20:24.261287 2064791 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1219 06:20:24.273343 2064791 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1219 06:20:24.278556 2064791 out.go:252]   - Generating certificates and keys ...
	I1219 06:20:24.278645 2064791 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1219 06:20:24.278707 2064791 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1219 06:20:24.278781 2064791 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1219 06:20:24.278840 2064791 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1219 06:20:24.278908 2064791 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1219 06:20:24.278961 2064791 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1219 06:20:24.279023 2064791 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1219 06:20:24.279082 2064791 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1219 06:20:24.279155 2064791 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1219 06:20:24.279227 2064791 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1219 06:20:24.279263 2064791 kubeadm.go:319] [certs] Using the existing "sa" key
	I1219 06:20:24.279319 2064791 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1219 06:20:24.586742 2064791 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1219 06:20:24.705000 2064791 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1219 06:20:25.117117 2064791 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1219 06:20:25.207046 2064791 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1219 06:20:25.407003 2064791 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1219 06:20:25.408181 2064791 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1219 06:20:25.412332 2064791 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1219 06:20:25.415422 2064791 out.go:252]   - Booting up control plane ...
	I1219 06:20:25.415519 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1219 06:20:25.415596 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1219 06:20:25.415664 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1219 06:20:25.435196 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1219 06:20:25.435555 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1219 06:20:25.442782 2064791 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1219 06:20:25.443056 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1219 06:20:25.443098 2064791 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1219 06:20:25.586740 2064791 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1219 06:20:25.586852 2064791 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1219 06:24:25.586924 2064791 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000209622s
	I1219 06:24:25.586949 2064791 kubeadm.go:319] 
	I1219 06:24:25.587005 2064791 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1219 06:24:25.587037 2064791 kubeadm.go:319] 	- The kubelet is not running
	I1219 06:24:25.587152 2064791 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1219 06:24:25.587157 2064791 kubeadm.go:319] 
	I1219 06:24:25.587305 2064791 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1219 06:24:25.587351 2064791 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1219 06:24:25.587399 2064791 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1219 06:24:25.587405 2064791 kubeadm.go:319] 
	I1219 06:24:25.592745 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 06:24:25.593206 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1219 06:24:25.593358 2064791 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 06:24:25.593654 2064791 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1219 06:24:25.593660 2064791 kubeadm.go:319] 
	I1219 06:24:25.593751 2064791 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1219 06:24:25.593818 2064791 kubeadm.go:403] duration metric: took 12m8.761907578s to StartCluster
	I1219 06:24:25.593849 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:24:25.593915 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:24:25.619076 2064791 cri.go:92] found id: ""
	I1219 06:24:25.619090 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.619097 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:24:25.619103 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:24:25.619166 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:24:25.645501 2064791 cri.go:92] found id: ""
	I1219 06:24:25.645515 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.645522 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:24:25.645527 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:24:25.645587 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:24:25.671211 2064791 cri.go:92] found id: ""
	I1219 06:24:25.671225 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.671232 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:24:25.671237 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:24:25.671297 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:24:25.695076 2064791 cri.go:92] found id: ""
	I1219 06:24:25.695090 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.695098 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:24:25.695104 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:24:25.695165 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:24:25.720717 2064791 cri.go:92] found id: ""
	I1219 06:24:25.720733 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.720740 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:24:25.720745 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:24:25.720832 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:24:25.746445 2064791 cri.go:92] found id: ""
	I1219 06:24:25.746460 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.746466 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:24:25.746478 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:24:25.746541 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:24:25.771217 2064791 cri.go:92] found id: ""
	I1219 06:24:25.771231 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.771238 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:24:25.771249 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:24:25.771259 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:24:25.827848 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:24:25.827867 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:24:25.845454 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:24:25.845470 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:24:25.916464 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:24:25.906852   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.907635   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909247   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909952   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.911845   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:24:25.906852   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.907635   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909247   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909952   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.911845   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:24:25.916485 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:24:25.916495 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:24:25.988149 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:24:25.988168 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1219 06:24:26.019538 2064791 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000209622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	W1219 06:24:26.019579 2064791 out.go:285] * 
	W1219 06:24:26.019696 2064791 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000209622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1219 06:24:26.019769 2064791 out.go:285] * 
	W1219 06:24:26.022296 2064791 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1219 06:24:26.028311 2064791 out.go:203] 
	W1219 06:24:26.031204 2064791 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000209622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1219 06:24:26.031251 2064791 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1219 06:24:26.031270 2064791 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1219 06:24:26.034280 2064791 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483798627Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483867223Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483965234Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484038564Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484104559Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484166960Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484226562Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484289119Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484361021Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484452469Z" level=info msg="Connect containerd service"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484896289Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.485577404Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.498876654Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.499249089Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.498953709Z" level=info msg="Start subscribing containerd event"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.499359457Z" level=info msg="Start recovering state"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541527820Z" level=info msg="Start event monitor"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541744389Z" level=info msg="Start cni network conf syncer for default"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541814527Z" level=info msg="Start streaming server"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541876723Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541934873Z" level=info msg="runtime interface starting up..."
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541989897Z" level=info msg="starting plugins..."
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.542066690Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 19 06:12:15 functional-006924 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.544548112Z" level=info msg="containerd successfully booted in 0.093860s"
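The containerd log above reports that no CNI network config was found in /etc/cni/net.d at init time. As a minimal diagnostic sketch (the path is taken from the error message above; this check is illustrative and not part of the test run):

```shell
# List any CNI network configs; containerd logged that none were found.
# The path /etc/cni/net.d comes from the "failed to load cni" error above.
ls -l /etc/cni/net.d 2>/dev/null || echo "no CNI config directory"
```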
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:24:29.606323   21191 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:29.606742   21191 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:29.608290   21191 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:29.608622   21191 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:29.610123   21191 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 06:24:29 up 11:06,  0 user,  load average: 0.20, 0.18, 0.44
	Linux functional-006924 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 19 06:24:26 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:26 functional-006924 kubelet[20971]: E1219 06:24:26.716391   20971 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:24:26 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:24:26 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:24:27 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Dec 19 06:24:27 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:27 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:27 functional-006924 kubelet[21068]: E1219 06:24:27.467987   21068 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:24:27 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:24:27 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:24:28 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 323.
	Dec 19 06:24:28 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:28 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:28 functional-006924 kubelet[21076]: E1219 06:24:28.203291   21076 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:24:28 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:24:28 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:24:28 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 324.
	Dec 19 06:24:28 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:28 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:28 functional-006924 kubelet[21110]: E1219 06:24:28.975526   21110 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:24:28 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:24:28 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:24:29 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 325.
	Dec 19 06:24:29 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:24:29 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	

-- /stdout --
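The kubelet section above shows the restart counter in the 320s, with every attempt failing validation because the host uses cgroup v1. A minimal diagnostic sketch, assuming a Linux host and the config path written in the kubeadm output above (the path may differ per setup):

```shell
# Print the filesystem type mounted at /sys/fs/cgroup:
# "cgroup2fs" means cgroup v2 (unified hierarchy); "tmpfs" means cgroup v1.
stat -fc %T /sys/fs/cgroup

# Look for the failCgroupV1 option mentioned in the kubeadm warning;
# /var/lib/kubelet/config.yaml is the path written in the log above.
grep -i failcgroupv1 /var/lib/kubelet/config.yaml 2>/dev/null \
  || echo "failCgroupV1 not set in config"
```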
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924: exit status 2 (387.569674ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-006924" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/ComponentHealth (2.29s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/InvalidService (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/InvalidService
functional_test.go:2326: (dbg) Run:  kubectl --context functional-006924 apply -f testdata/invalidsvc.yaml
functional_test.go:2326: (dbg) Non-zero exit: kubectl --context functional-006924 apply -f testdata/invalidsvc.yaml: exit status 1 (58.600295ms)

** stderr ** 
	error: error validating "testdata/invalidsvc.yaml": error validating data: failed to download openapi: Get "https://192.168.49.2:8441/openapi/v2?timeout=32s": dial tcp 192.168.49.2:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false

** /stderr **
functional_test.go:2328: kubectl --context functional-006924 apply -f testdata/invalidsvc.yaml failed: exit status 1
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/InvalidService (0.06s)
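The apply fails before validation runs because the apiserver endpoint is unreachable. A hedged reachability probe, using the address and port from the error above (adjust for your profile; this is illustrative and not part of the test):

```shell
# Probe the apiserver /healthz endpoint noted in the error message.
# -k skips TLS verification; --max-time bounds the wait on a dead endpoint.
if curl -k --max-time 5 -s -o /dev/null https://192.168.49.2:8441/healthz; then
  echo "apiserver reachable"
else
  echo "apiserver unreachable"
fi
```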

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd (1.77s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd
functional_test.go:920: (dbg) daemon: [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-006924 --alsologtostderr -v=1]
functional_test.go:933: output didn't produce a URL
functional_test.go:925: (dbg) stopping [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-006924 --alsologtostderr -v=1] ...
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-006924 --alsologtostderr -v=1] stdout:
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-006924 --alsologtostderr -v=1] stderr:
I1219 06:26:29.919498 2082185 out.go:360] Setting OutFile to fd 1 ...
I1219 06:26:29.919656 2082185 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:29.919673 2082185 out.go:374] Setting ErrFile to fd 2...
I1219 06:26:29.919679 2082185 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:29.919970 2082185 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 06:26:29.920287 2082185 mustload.go:66] Loading cluster: functional-006924
I1219 06:26:29.920820 2082185 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1219 06:26:29.921345 2082185 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
I1219 06:26:29.939885 2082185 host.go:66] Checking if "functional-006924" exists ...
I1219 06:26:29.940198 2082185 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1219 06:26:30.021153 2082185 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:26:29.993178107 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aa
rch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pa
th:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1219 06:26:30.021304 2082185 api_server.go:166] Checking apiserver status ...
I1219 06:26:30.021379 2082185 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1219 06:26:30.021426 2082185 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
I1219 06:26:30.041820 2082185 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
W1219 06:26:30.155096 2082185 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
stdout:

stderr:
I1219 06:26:30.158360 2082185 out.go:179] * The control-plane node functional-006924 apiserver is not running: (state=Stopped)
I1219 06:26:30.161344 2082185 out.go:179]   To start a cluster, run: "minikube start -p functional-006924"
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-006924
helpers_test.go:244: (dbg) docker inspect functional-006924:

-- stdout --
	[
	    {
	        "Id": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	        "Created": "2025-12-19T05:57:32.987616309Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2053574,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T05:57:33.050252475Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hostname",
	        "HostsPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hosts",
	        "LogPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6-json.log",
	        "Name": "/functional-006924",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-006924:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-006924",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	                "LowerDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/merged",
	                "UpperDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/diff",
	                "WorkDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-006924",
	                "Source": "/var/lib/docker/volumes/functional-006924/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-006924",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-006924",
	                "name.minikube.sigs.k8s.io": "functional-006924",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "c06ab2bd44169716d410789ed39ed6e7c04e20cbf7fddb96691439282b9c97ca",
	            "SandboxKey": "/var/run/docker/netns/c06ab2bd4416",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34704"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34705"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34708"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34706"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34707"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-006924": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "42:2f:87:6a:a8:7b",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "f63e8dc2cff83663f8a4d14108f192e61e457410fa4fc720cd9630dbf354815d",
	                    "EndpointID": "aa2b1cbd90d5c1f6130481423d97f82d974d4197e41ad0dbe3b7e51b22c8b4cc",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-006924",
	                        "651d0d6ef1db"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924: exit status 2 (331.925658ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd logs: 
-- stdout --
	
	==> Audit <==
	┌───────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│  COMMAND  │                                                                        ARGS                                                                         │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├───────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ service   │ functional-006924 service hello-node --url                                                                                                          │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ mount     │ -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3154741832/001:/mount-9p --alsologtostderr -v=1              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh       │ functional-006924 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh       │ functional-006924 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh       │ functional-006924 ssh -- ls -la /mount-9p                                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh       │ functional-006924 ssh cat /mount-9p/test-1766125579269030938                                                                                        │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh       │ functional-006924 ssh mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates                                                                    │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh       │ functional-006924 ssh sudo umount -f /mount-9p                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ mount     │ -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1569475337/001:/mount-9p --alsologtostderr -v=1 --port 33725 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh       │ functional-006924 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh       │ functional-006924 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh       │ functional-006924 ssh -- ls -la /mount-9p                                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh       │ functional-006924 ssh sudo umount -f /mount-9p                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ mount     │ -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2343003325/001:/mount1 --alsologtostderr -v=1                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ mount     │ -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2343003325/001:/mount2 --alsologtostderr -v=1                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh       │ functional-006924 ssh findmnt -T /mount1                                                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ mount     │ -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2343003325/001:/mount3 --alsologtostderr -v=1                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh       │ functional-006924 ssh findmnt -T /mount1                                                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh       │ functional-006924 ssh findmnt -T /mount2                                                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh       │ functional-006924 ssh findmnt -T /mount3                                                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ mount     │ -p functional-006924 --kill=true                                                                                                                    │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ start     │ -p functional-006924 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1   │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ start     │ -p functional-006924 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1   │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ start     │ -p functional-006924 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1             │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ dashboard │ --url --port 36195 -p functional-006924 --alsologtostderr -v=1                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	└───────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 06:26:29
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 06:26:29.646762 2082108 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:26:29.646954 2082108 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:26:29.646981 2082108 out.go:374] Setting ErrFile to fd 2...
	I1219 06:26:29.647003 2082108 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:26:29.647305 2082108 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:26:29.652707 2082108 out.go:368] Setting JSON to false
	I1219 06:26:29.653574 2082108 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":40136,"bootTime":1766085454,"procs":158,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:26:29.653730 2082108 start.go:143] virtualization:  
	I1219 06:26:29.656825 2082108 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:26:29.660666 2082108 notify.go:221] Checking for updates...
	I1219 06:26:29.663753 2082108 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:26:29.666653 2082108 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:26:29.669563 2082108 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:26:29.672409 2082108 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:26:29.675294 2082108 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:26:29.680090 2082108 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:26:29.683366 2082108 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:26:29.683923 2082108 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:26:29.729856 2082108 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:26:29.729981 2082108 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:26:29.790364 2082108 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:26:29.781319903 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:26:29.790469 2082108 docker.go:319] overlay module found
	I1219 06:26:29.793548 2082108 out.go:179] * Using the docker driver based on existing profile
	I1219 06:26:29.796474 2082108 start.go:309] selected driver: docker
	I1219 06:26:29.796494 2082108 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APISer
verName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUI
D:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:26:29.796598 2082108 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:26:29.796707 2082108 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:26:29.856113 2082108 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:26:29.847123734 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:26:29.856550 2082108 cni.go:84] Creating CNI manager for ""
	I1219 06:26:29.856616 2082108 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:26:29.856657 2082108 start.go:353] cluster config:
	{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Con
tainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCo
reDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:26:29.859809 2082108 out.go:179] * dry-run validation complete!
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483798627Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483867223Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483965234Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484038564Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484104559Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484166960Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484226562Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484289119Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484361021Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484452469Z" level=info msg="Connect containerd service"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484896289Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.485577404Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.498876654Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.499249089Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.498953709Z" level=info msg="Start subscribing containerd event"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.499359457Z" level=info msg="Start recovering state"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541527820Z" level=info msg="Start event monitor"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541744389Z" level=info msg="Start cni network conf syncer for default"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541814527Z" level=info msg="Start streaming server"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541876723Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541934873Z" level=info msg="runtime interface starting up..."
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541989897Z" level=info msg="starting plugins..."
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.542066690Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 19 06:12:15 functional-006924 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.544548112Z" level=info msg="containerd successfully booted in 0.093860s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:26:31.230659   23240 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:31.231369   23240 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:31.233221   23240 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:31.233677   23240 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:31.235229   23240 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 06:26:31 up 11:08,  0 user,  load average: 0.90, 0.34, 0.46
	Linux functional-006924 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 19 06:26:28 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:28 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 484.
	Dec 19 06:26:28 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:28 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:28 functional-006924 kubelet[23102]: E1219 06:26:28.970036   23102 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:28 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:28 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:29 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 485.
	Dec 19 06:26:29 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:29 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:29 functional-006924 kubelet[23122]: E1219 06:26:29.714644   23122 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:29 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:29 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:30 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 486.
	Dec 19 06:26:30 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:30 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:30 functional-006924 kubelet[23136]: E1219 06:26:30.466056   23136 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:30 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:30 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:31 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 487.
	Dec 19 06:26:31 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:31 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:31 functional-006924 kubelet[23234]: E1219 06:26:31.210166   23234 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:31 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:31 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
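The repeated kubelet restarts in the log above all fail the same validation: this kubelet build refuses to start on a cgroup v1 host. One way to see which mode a host is in is the filesystem type mounted at `/sys/fs/cgroup` (`cgroup2fs` means v2); a minimal sketch of that classification, with the real `stat` call left in a comment so the snippet runs on any platform:

```python
def classify_cgroup(fstype: str) -> str:
    """Map the filesystem type of /sys/fs/cgroup to the mode kubelet sees."""
    return "v2" if fstype == "cgroup2fs" else "v1/hybrid"

# On a Linux host, the real value comes from:
#   stat -fc %T /sys/fs/cgroup
print(classify_cgroup("cgroup2fs"))  # v2
print(classify_cgroup("tmpfs"))      # v1/hybrid
```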
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924: exit status 2 (311.641193ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-006924" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DashboardCmd (1.77s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd (3.11s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd
functional_test.go:869: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 status
functional_test.go:869: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 status: exit status 2 (302.312545ms)

-- stdout --
	functional-006924
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	

-- /stdout --
functional_test.go:871: failed to run minikube status. args "out/minikube-linux-arm64 -p functional-006924 status" : exit status 2
functional_test.go:875: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:875: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}: exit status 2 (326.949368ms)

-- stdout --
	host:Running,kublet:Running,apiserver:Stopped,kubeconfig:Configured

-- /stdout --
functional_test.go:877: failed to run minikube status with custom format: args "out/minikube-linux-arm64 -p functional-006924 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}": exit status 2
functional_test.go:887: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 status -o json
functional_test.go:887: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 status -o json: exit status 2 (318.475563ms)

-- stdout --
	{"Name":"functional-006924","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
functional_test.go:889: failed to run minikube status with json output. args "out/minikube-linux-arm64 -p functional-006924 status -o json" : exit status 2
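The JSON status captured above can be checked mechanically rather than by eye; a minimal sketch, reusing the exact payload from this run (the expected-state table is an assumption about what counts as healthy, not code from minikube itself):

```python
import json

# Payload exactly as printed by `minikube status -o json` in this run.
payload = ('{"Name":"functional-006924","Host":"Running","Kubelet":"Stopped",'
           '"APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}')
status = json.loads(payload)

def unhealthy(s: dict) -> dict:
    """Return the components whose state differs from the healthy baseline."""
    want = {"Host": "Running", "Kubelet": "Running",
            "APIServer": "Running", "Kubeconfig": "Configured"}
    return {k: s.get(k) for k, w in want.items() if s.get(k) != w}

print(unhealthy(status))  # {'Kubelet': 'Stopped', 'APIServer': 'Stopped'}
```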
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-006924
helpers_test.go:244: (dbg) docker inspect functional-006924:

-- stdout --
	[
	    {
	        "Id": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	        "Created": "2025-12-19T05:57:32.987616309Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2053574,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T05:57:33.050252475Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hostname",
	        "HostsPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hosts",
	        "LogPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6-json.log",
	        "Name": "/functional-006924",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-006924:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-006924",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	                "LowerDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/merged",
	                "UpperDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/diff",
	                "WorkDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-006924",
	                "Source": "/var/lib/docker/volumes/functional-006924/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-006924",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-006924",
	                "name.minikube.sigs.k8s.io": "functional-006924",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "c06ab2bd44169716d410789ed39ed6e7c04e20cbf7fddb96691439282b9c97ca",
	            "SandboxKey": "/var/run/docker/netns/c06ab2bd4416",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34704"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34705"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34708"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34706"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34707"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-006924": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "42:2f:87:6a:a8:7b",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "f63e8dc2cff83663f8a4d14108f192e61e457410fa4fc720cd9630dbf354815d",
	                    "EndpointID": "aa2b1cbd90d5c1f6130481423d97f82d974d4197e41ad0dbe3b7e51b22c8b4cc",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-006924",
	                        "651d0d6ef1db"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
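The port mappings buried in the inspect output above can be pulled out programmatically instead of read by hand; a sketch against a trimmed copy of this run's `NetworkSettings.Ports` (the same lookup can also be done with `docker inspect`'s own `--format` templates):

```python
import json

# Trimmed reproduction of the NetworkSettings.Ports map from the inspect output above.
inspect_output = json.loads("""
[{"Name": "/functional-006924",
  "NetworkSettings": {"Ports": {
    "22/tcp":   [{"HostIp": "127.0.0.1", "HostPort": "34704"}],
    "8441/tcp": [{"HostIp": "127.0.0.1", "HostPort": "34707"}]
  }}}]
""")

def host_port(inspect: list, container_port: str):
    """First published host port for a container port such as '8441/tcp'."""
    bindings = inspect[0]["NetworkSettings"]["Ports"].get(container_port) or []
    return bindings[0]["HostPort"] if bindings else None

print(host_port(inspect_output, "8441/tcp"))  # 34707
```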
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924: exit status 2 (348.151332ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                        ARGS                                                                         │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ service │ functional-006924 service list                                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ service │ functional-006924 service list -o json                                                                                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ service │ functional-006924 service --namespace=default --https --url hello-node                                                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ service │ functional-006924 service hello-node --url --format={{.IP}}                                                                                         │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ service │ functional-006924 service hello-node --url                                                                                                          │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ mount   │ -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3154741832/001:/mount-9p --alsologtostderr -v=1              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh     │ functional-006924 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh     │ functional-006924 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh     │ functional-006924 ssh -- ls -la /mount-9p                                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh     │ functional-006924 ssh cat /mount-9p/test-1766125579269030938                                                                                        │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh     │ functional-006924 ssh mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates                                                                    │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh     │ functional-006924 ssh sudo umount -f /mount-9p                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ mount   │ -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1569475337/001:/mount-9p --alsologtostderr -v=1 --port 33725 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh     │ functional-006924 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh     │ functional-006924 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh     │ functional-006924 ssh -- ls -la /mount-9p                                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh     │ functional-006924 ssh sudo umount -f /mount-9p                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ mount   │ -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2343003325/001:/mount1 --alsologtostderr -v=1                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ mount   │ -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2343003325/001:/mount2 --alsologtostderr -v=1                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh     │ functional-006924 ssh findmnt -T /mount1                                                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ mount   │ -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2343003325/001:/mount3 --alsologtostderr -v=1                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh     │ functional-006924 ssh findmnt -T /mount1                                                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh     │ functional-006924 ssh findmnt -T /mount2                                                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh     │ functional-006924 ssh findmnt -T /mount3                                                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ mount   │ -p functional-006924 --kill=true                                                                                                                    │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 06:12:12
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 06:12:12.743158 2064791 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:12:12.743269 2064791 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:12:12.743273 2064791 out.go:374] Setting ErrFile to fd 2...
	I1219 06:12:12.743277 2064791 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:12:12.743528 2064791 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:12:12.743902 2064791 out.go:368] Setting JSON to false
	I1219 06:12:12.744837 2064791 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":39279,"bootTime":1766085454,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:12:12.744896 2064791 start.go:143] virtualization:  
	I1219 06:12:12.748217 2064791 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:12:12.751238 2064791 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:12:12.751295 2064791 notify.go:221] Checking for updates...
	I1219 06:12:12.757153 2064791 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:12:12.760103 2064791 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:12:12.763068 2064791 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:12:12.765948 2064791 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:12:12.768902 2064791 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:12:12.772437 2064791 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:12:12.772538 2064791 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:12:12.804424 2064791 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:12:12.804525 2064791 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:12:12.859954 2064791 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-19 06:12:12.850685523 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:12:12.860047 2064791 docker.go:319] overlay module found
	I1219 06:12:12.863098 2064791 out.go:179] * Using the docker driver based on existing profile
	I1219 06:12:12.866014 2064791 start.go:309] selected driver: docker
	I1219 06:12:12.866030 2064791 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APISer
verName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableC
oreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:12:12.866122 2064791 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:12:12.866232 2064791 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:12:12.920329 2064791 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-19 06:12:12.911575892 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:12:12.920732 2064791 start_flags.go:993] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1219 06:12:12.920793 2064791 cni.go:84] Creating CNI manager for ""
	I1219 06:12:12.920848 2064791 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:12:12.920889 2064791 start.go:353] cluster config:
	{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Con
tainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCo
reDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:12:12.924076 2064791 out.go:179] * Starting "functional-006924" primary control-plane node in "functional-006924" cluster
	I1219 06:12:12.926767 2064791 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1219 06:12:12.929823 2064791 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1219 06:12:12.932605 2064791 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:12:12.932642 2064791 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1219 06:12:12.932650 2064791 cache.go:65] Caching tarball of preloaded images
	I1219 06:12:12.932677 2064791 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1219 06:12:12.932745 2064791 preload.go:238] Found /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1219 06:12:12.932796 2064791 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1219 06:12:12.932911 2064791 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/config.json ...
	I1219 06:12:12.951789 2064791 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1219 06:12:12.951800 2064791 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1219 06:12:12.951830 2064791 cache.go:243] Successfully downloaded all kic artifacts
	I1219 06:12:12.951863 2064791 start.go:360] acquireMachinesLock for functional-006924: {Name:mkc84f48e83d18024791d45db780f3ccd746613a Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1219 06:12:12.951927 2064791 start.go:364] duration metric: took 47.033µs to acquireMachinesLock for "functional-006924"
	I1219 06:12:12.951947 2064791 start.go:96] Skipping create...Using existing machine configuration
	I1219 06:12:12.951951 2064791 fix.go:54] fixHost starting: 
	I1219 06:12:12.952210 2064791 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:12:12.969279 2064791 fix.go:112] recreateIfNeeded on functional-006924: state=Running err=<nil>
	W1219 06:12:12.969299 2064791 fix.go:138] unexpected machine state, will restart: <nil>
	I1219 06:12:12.972432 2064791 out.go:252] * Updating the running docker "functional-006924" container ...
	I1219 06:12:12.972457 2064791 machine.go:94] provisionDockerMachine start ...
	I1219 06:12:12.972536 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:12.989705 2064791 main.go:144] libmachine: Using SSH client type: native
	I1219 06:12:12.990045 2064791 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:12:12.990052 2064791 main.go:144] libmachine: About to run SSH command:
	hostname
	I1219 06:12:13.144528 2064791 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:12:13.144543 2064791 ubuntu.go:182] provisioning hostname "functional-006924"
	I1219 06:12:13.144626 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:13.163735 2064791 main.go:144] libmachine: Using SSH client type: native
	I1219 06:12:13.164043 2064791 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:12:13.164057 2064791 main.go:144] libmachine: About to run SSH command:
	sudo hostname functional-006924 && echo "functional-006924" | sudo tee /etc/hostname
	I1219 06:12:13.331538 2064791 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:12:13.331610 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:13.350490 2064791 main.go:144] libmachine: Using SSH client type: native
	I1219 06:12:13.350800 2064791 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:12:13.350813 2064791 main.go:144] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-006924' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-006924/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-006924' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1219 06:12:13.509192 2064791 main.go:144] libmachine: SSH cmd err, output: <nil>: 
	I1219 06:12:13.509210 2064791 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22230-1998525/.minikube CaCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22230-1998525/.minikube}
	I1219 06:12:13.509245 2064791 ubuntu.go:190] setting up certificates
	I1219 06:12:13.509254 2064791 provision.go:84] configureAuth start
	I1219 06:12:13.509315 2064791 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:12:13.528067 2064791 provision.go:143] copyHostCerts
	I1219 06:12:13.528151 2064791 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem, removing ...
	I1219 06:12:13.528164 2064791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:12:13.528239 2064791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem (1671 bytes)
	I1219 06:12:13.528339 2064791 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem, removing ...
	I1219 06:12:13.528348 2064791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:12:13.528375 2064791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem (1078 bytes)
	I1219 06:12:13.528452 2064791 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem, removing ...
	I1219 06:12:13.528456 2064791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:12:13.528480 2064791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem (1123 bytes)
	I1219 06:12:13.528529 2064791 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem org=jenkins.functional-006924 san=[127.0.0.1 192.168.49.2 functional-006924 localhost minikube]
	I1219 06:12:13.839797 2064791 provision.go:177] copyRemoteCerts
	I1219 06:12:13.839849 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1219 06:12:13.839888 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:13.857134 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:13.968475 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1219 06:12:13.985747 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1219 06:12:14.005527 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1219 06:12:14.024925 2064791 provision.go:87] duration metric: took 515.64823ms to configureAuth
	I1219 06:12:14.024943 2064791 ubuntu.go:206] setting minikube options for container-runtime
	I1219 06:12:14.025140 2064791 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:12:14.025146 2064791 machine.go:97] duration metric: took 1.052684031s to provisionDockerMachine
	I1219 06:12:14.025152 2064791 start.go:293] postStartSetup for "functional-006924" (driver="docker")
	I1219 06:12:14.025162 2064791 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1219 06:12:14.025218 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1219 06:12:14.025263 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.043178 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.148605 2064791 ssh_runner.go:195] Run: cat /etc/os-release
	I1219 06:12:14.151719 2064791 main.go:144] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1219 06:12:14.151753 2064791 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1219 06:12:14.151766 2064791 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/addons for local assets ...
	I1219 06:12:14.151823 2064791 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/files for local assets ...
	I1219 06:12:14.151902 2064791 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> 20003862.pem in /etc/ssl/certs
	I1219 06:12:14.151975 2064791 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> hosts in /etc/test/nested/copy/2000386
	I1219 06:12:14.152026 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/2000386
	I1219 06:12:14.159336 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:12:14.177055 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts --> /etc/test/nested/copy/2000386/hosts (40 bytes)
	I1219 06:12:14.195053 2064791 start.go:296] duration metric: took 169.886807ms for postStartSetup
	I1219 06:12:14.195138 2064791 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:12:14.195175 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.212871 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.317767 2064791 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1219 06:12:14.322386 2064791 fix.go:56] duration metric: took 1.37042768s for fixHost
	I1219 06:12:14.322401 2064791 start.go:83] releasing machines lock for "functional-006924", held for 1.370467196s
	I1219 06:12:14.322474 2064791 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:12:14.339208 2064791 ssh_runner.go:195] Run: cat /version.json
	I1219 06:12:14.339250 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.339514 2064791 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1219 06:12:14.339574 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.363989 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.366009 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.468260 2064791 ssh_runner.go:195] Run: systemctl --version
	I1219 06:12:14.559810 2064791 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1219 06:12:14.563901 2064791 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1219 06:12:14.563968 2064791 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1219 06:12:14.571453 2064791 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1219 06:12:14.571466 2064791 start.go:496] detecting cgroup driver to use...
	I1219 06:12:14.571496 2064791 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1219 06:12:14.571541 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1219 06:12:14.588970 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1219 06:12:14.603919 2064791 docker.go:218] disabling cri-docker service (if available) ...
	I1219 06:12:14.603971 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1219 06:12:14.620412 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1219 06:12:14.634912 2064791 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1219 06:12:14.757018 2064791 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1219 06:12:14.879281 2064791 docker.go:234] disabling docker service ...
	I1219 06:12:14.879341 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1219 06:12:14.894279 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1219 06:12:14.907362 2064791 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1219 06:12:15.033676 2064791 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1219 06:12:15.155919 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1219 06:12:15.169590 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1219 06:12:15.184917 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1219 06:12:15.194691 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1219 06:12:15.203742 2064791 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1219 06:12:15.203801 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1219 06:12:15.212945 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:12:15.221903 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1219 06:12:15.231019 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:12:15.239988 2064791 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1219 06:12:15.248292 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1219 06:12:15.257554 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1219 06:12:15.266460 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1219 06:12:15.275351 2064791 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1219 06:12:15.282864 2064791 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1219 06:12:15.290662 2064791 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:12:15.400462 2064791 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1219 06:12:15.544853 2064791 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1219 06:12:15.544914 2064791 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1219 06:12:15.549076 2064791 start.go:564] Will wait 60s for crictl version
	I1219 06:12:15.549132 2064791 ssh_runner.go:195] Run: which crictl
	I1219 06:12:15.552855 2064791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1219 06:12:15.578380 2064791 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1219 06:12:15.578461 2064791 ssh_runner.go:195] Run: containerd --version
	I1219 06:12:15.600920 2064791 ssh_runner.go:195] Run: containerd --version
	I1219 06:12:15.626436 2064791 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1219 06:12:15.629308 2064791 cli_runner.go:164] Run: docker network inspect functional-006924 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 06:12:15.645624 2064791 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1219 06:12:15.652379 2064791 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1219 06:12:15.655147 2064791 kubeadm.go:884] updating cluster {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeC
A APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMi
rror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1219 06:12:15.655272 2064791 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:12:15.655368 2064791 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:12:15.679674 2064791 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:12:15.679686 2064791 containerd.go:534] Images already preloaded, skipping extraction
	I1219 06:12:15.679751 2064791 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:12:15.704545 2064791 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:12:15.704557 2064791 cache_images.go:86] Images are preloaded, skipping loading
	I1219 06:12:15.704563 2064791 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1219 06:12:15.704666 2064791 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-006924 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1219 06:12:15.704733 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s info
	I1219 06:12:15.729671 2064791 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1219 06:12:15.729690 2064791 cni.go:84] Creating CNI manager for ""
	I1219 06:12:15.729697 2064791 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:12:15.729711 2064791 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1219 06:12:15.729738 2064791 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-006924 NodeName:functional-006924 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false Kubelet
ConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1219 06:12:15.729853 2064791 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-006924"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1219 06:12:15.729919 2064791 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1219 06:12:15.737786 2064791 binaries.go:51] Found k8s binaries, skipping transfer
	I1219 06:12:15.737845 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1219 06:12:15.745456 2064791 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1219 06:12:15.758378 2064791 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1219 06:12:15.775454 2064791 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2085 bytes)
	I1219 06:12:15.788878 2064791 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1219 06:12:15.792954 2064791 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:12:15.901526 2064791 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:12:16.150661 2064791 certs.go:69] Setting up /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924 for IP: 192.168.49.2
	I1219 06:12:16.150673 2064791 certs.go:195] generating shared ca certs ...
	I1219 06:12:16.150687 2064791 certs.go:227] acquiring lock for ca certs: {Name:mk382c71693ea4061363f97b153b21bf6cdf5f38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:12:16.150828 2064791 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key
	I1219 06:12:16.150868 2064791 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key
	I1219 06:12:16.150873 2064791 certs.go:257] generating profile certs ...
	I1219 06:12:16.150961 2064791 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key
	I1219 06:12:16.151009 2064791 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key.febe6fed
	I1219 06:12:16.151048 2064791 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key
	I1219 06:12:16.151165 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem (1338 bytes)
	W1219 06:12:16.151195 2064791 certs.go:480] ignoring /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386_empty.pem, impossibly tiny 0 bytes
	I1219 06:12:16.151202 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem (1679 bytes)
	I1219 06:12:16.151230 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem (1078 bytes)
	I1219 06:12:16.151264 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem (1123 bytes)
	I1219 06:12:16.151286 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem (1671 bytes)
	I1219 06:12:16.151329 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:12:16.151962 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1219 06:12:16.174202 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1219 06:12:16.194590 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1219 06:12:16.215085 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1219 06:12:16.232627 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1219 06:12:16.250371 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1219 06:12:16.267689 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1219 06:12:16.285522 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1219 06:12:16.302837 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /usr/share/ca-certificates/20003862.pem (1708 bytes)
	I1219 06:12:16.320411 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1219 06:12:16.337922 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem --> /usr/share/ca-certificates/2000386.pem (1338 bytes)
	I1219 06:12:16.355077 2064791 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (722 bytes)
	I1219 06:12:16.368122 2064791 ssh_runner.go:195] Run: openssl version
	I1219 06:12:16.374305 2064791 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.381720 2064791 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1219 06:12:16.389786 2064791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.393456 2064791 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.393514 2064791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.434859 2064791 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1219 06:12:16.442942 2064791 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.450665 2064791 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/2000386.pem /etc/ssl/certs/2000386.pem
	I1219 06:12:16.458612 2064791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.462545 2064791 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.462603 2064791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.503732 2064791 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1219 06:12:16.511394 2064791 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.519328 2064791 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/20003862.pem /etc/ssl/certs/20003862.pem
	I1219 06:12:16.526844 2064791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.530487 2064791 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.530547 2064791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.571532 2064791 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1219 06:12:16.579524 2064791 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:12:16.583470 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1219 06:12:16.624483 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1219 06:12:16.665575 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1219 06:12:16.707109 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1219 06:12:16.749520 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1219 06:12:16.790988 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1219 06:12:16.831921 2064791 kubeadm.go:401] StartCluster: {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA A
PIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirro
r: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:12:16.832006 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1219 06:12:16.832084 2064791 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:12:16.857771 2064791 cri.go:92] found id: ""
	I1219 06:12:16.857833 2064791 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1219 06:12:16.866091 2064791 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1219 06:12:16.866101 2064791 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1219 06:12:16.866158 2064791 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1219 06:12:16.873926 2064791 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.874482 2064791 kubeconfig.go:125] found "functional-006924" server: "https://192.168.49.2:8441"
	I1219 06:12:16.875731 2064791 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1219 06:12:16.883987 2064791 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-19 05:57:41.594715365 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-19 06:12:15.784216685 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1219 06:12:16.884007 2064791 kubeadm.go:1161] stopping kube-system containers ...
	I1219 06:12:16.884018 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1219 06:12:16.884079 2064791 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:12:16.914439 2064791 cri.go:92] found id: ""
	I1219 06:12:16.914509 2064791 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1219 06:12:16.934128 2064791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:12:16.942432 2064791 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5631 Dec 19 06:01 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5640 Dec 19 06:01 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec 19 06:01 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec 19 06:01 /etc/kubernetes/scheduler.conf
	
	I1219 06:12:16.942490 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 06:12:16.950312 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 06:12:16.957901 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.957957 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:12:16.965831 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 06:12:16.973975 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.974043 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:12:16.981885 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 06:12:16.989698 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.989754 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:12:16.997294 2064791 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1219 06:12:17.007519 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:17.060607 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:18.829242 2064791 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.768608779s)
	I1219 06:12:18.829304 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:19.030093 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:19.096673 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:19.143573 2064791 api_server.go:52] waiting for apiserver process to appear ...
	I1219 06:12:19.143640 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:19.643853 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:20.143947 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:20.643846 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:21.143937 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:21.644473 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:22.143865 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:22.643833 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:23.143826 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:23.644236 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:24.144477 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:24.643853 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:25.144064 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:25.643843 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:26.144063 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:26.644478 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:27.144296 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:27.644722 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:28.143844 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:28.643941 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:29.144786 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:29.644723 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:30.143963 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:30.644625 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:31.144751 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:31.643964 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:32.143826 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:32.644605 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:33.144436 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:33.644603 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:34.144557 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:34.643903 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:35.143857 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:35.644797 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:36.144741 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:36.644680 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:37.143872 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:37.643878 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:38.143792 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:38.643830 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:39.143968 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:39.644577 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:40.144282 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:40.643845 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:41.144575 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:41.644658 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:42.144382 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:42.643720 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:43.144137 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:43.644655 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:44.144500 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:44.643786 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:45.143923 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:45.643858 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:46.143983 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:46.644774 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:47.143914 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:47.644188 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:48.144565 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:48.644497 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:49.144557 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:49.644207 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:50.144291 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:50.644010 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:51.144161 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:51.644181 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:52.144353 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:52.644199 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:53.144589 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:53.644816 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:54.143901 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:54.643842 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:55.144726 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:55.643993 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:56.143828 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:56.644199 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:57.144581 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:57.644119 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:58.144258 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:58.643843 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:59.144738 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:59.644593 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:00.143794 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:00.643826 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:01.144533 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:01.644697 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:02.144643 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:02.644679 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:03.143834 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:03.644373 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:04.144520 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:04.643962 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:05.143832 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:05.644579 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:06.143874 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:06.643870 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:07.143982 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:07.644598 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:08.144752 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:08.643796 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:09.143951 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:09.644657 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:10.143751 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:10.643986 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:11.143782 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:11.644460 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:12.144317 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:12.644346 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:13.144670 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:13.643862 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:14.144550 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:14.644576 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:15.144673 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:15.644083 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:16.144204 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:16.644063 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:17.144669 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:17.643808 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:18.144068 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:18.643722 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
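The run of `pgrep` lines above is minikube's wait-for-apiserver loop: it re-runs `sudo pgrep -xnf kube-apiserver.*minikube.*` roughly every 500 ms until the process appears or the wait deadline passes, after which it falls back to gathering diagnostic logs. The pattern can be sketched as follows (an illustrative shell sketch, not minikube source; `wait_for_process` and the target pattern are hypothetical names for this example):

```shell
# Poll for a process matching $1 every 0.5s until it appears or $2 seconds
# elapse. Prints "found" on success, "timeout" on failure.
wait_for_process() {
  pattern="$1"
  deadline=$(( $(date +%s) + ${2:-60} ))
  while [ "$(date +%s)" -lt "$deadline" ]; do
    # pgrep -f matches against the full command line, similar to the
    # pgrep -xnf invocation seen in the log above
    if pgrep -f "$pattern" >/dev/null 2>&1; then
      echo "found"
      return 0
    fi
    sleep 0.5
  done
  echo "timeout"
  return 1
}

# usage: start a short-lived background process, then wait for it
sleep 5 &
wait_for_process "sleep 5" 10
```

In the failing run above, the loop never finds a kube-apiserver process within the wait window, which is why the report then shows `crictl ps` returning no containers and `kubectl describe nodes` failing with connection-refused errors against port 8441.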
	I1219 06:13:19.143899 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:19.143976 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:19.173119 2064791 cri.go:92] found id: ""
	I1219 06:13:19.173133 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.173141 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:19.173146 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:19.173204 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:19.207794 2064791 cri.go:92] found id: ""
	I1219 06:13:19.207807 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.207814 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:19.207819 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:19.207884 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:19.237060 2064791 cri.go:92] found id: ""
	I1219 06:13:19.237074 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.237081 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:19.237092 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:19.237154 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:19.262099 2064791 cri.go:92] found id: ""
	I1219 06:13:19.262114 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.262121 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:19.262126 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:19.262185 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:19.287540 2064791 cri.go:92] found id: ""
	I1219 06:13:19.287554 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.287561 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:19.287566 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:19.287632 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:19.315088 2064791 cri.go:92] found id: ""
	I1219 06:13:19.315102 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.315109 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:19.315115 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:19.315176 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:19.340777 2064791 cri.go:92] found id: ""
	I1219 06:13:19.340791 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.340798 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:19.340806 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:19.340818 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:19.357916 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:19.357932 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:19.426302 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:19.417866   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.418382   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420072   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420408   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.421854   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:19.417866   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.418382   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420072   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420408   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.421854   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:19.426313 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:19.426323 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:19.488347 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:19.488367 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:19.520211 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:19.520229 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:22.084930 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:22.095535 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:22.095602 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:22.122011 2064791 cri.go:92] found id: ""
	I1219 06:13:22.122025 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.122034 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:22.122059 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:22.122131 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:22.146880 2064791 cri.go:92] found id: ""
	I1219 06:13:22.146893 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.146900 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:22.146905 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:22.146975 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:22.176007 2064791 cri.go:92] found id: ""
	I1219 06:13:22.176021 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.176028 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:22.176033 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:22.176095 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:22.211343 2064791 cri.go:92] found id: ""
	I1219 06:13:22.211357 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.211365 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:22.211370 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:22.211429 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:22.235806 2064791 cri.go:92] found id: ""
	I1219 06:13:22.235829 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.235836 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:22.235841 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:22.235910 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:22.260858 2064791 cri.go:92] found id: ""
	I1219 06:13:22.260882 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.260888 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:22.260894 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:22.260954 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:22.285583 2064791 cri.go:92] found id: ""
	I1219 06:13:22.285597 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.285604 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:22.285613 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:22.285624 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:22.302970 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:22.302988 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:22.371208 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:22.362378   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.363321   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365234   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365718   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.367191   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:22.362378   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.363321   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365234   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365718   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.367191   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:22.371227 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:22.371238 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:22.433354 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:22.433373 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:22.468288 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:22.468305 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:25.028097 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:25.038266 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:25.038327 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:25.066109 2064791 cri.go:92] found id: ""
	I1219 06:13:25.066123 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.066130 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:25.066136 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:25.066199 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:25.091083 2064791 cri.go:92] found id: ""
	I1219 06:13:25.091096 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.091103 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:25.091109 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:25.091175 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:25.116729 2064791 cri.go:92] found id: ""
	I1219 06:13:25.116743 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.116750 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:25.116808 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:25.116890 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:25.145471 2064791 cri.go:92] found id: ""
	I1219 06:13:25.145485 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.145492 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:25.145497 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:25.145555 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:25.173780 2064791 cri.go:92] found id: ""
	I1219 06:13:25.173795 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.173801 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:25.173807 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:25.173876 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:25.202994 2064791 cri.go:92] found id: ""
	I1219 06:13:25.203008 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.203015 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:25.203021 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:25.203082 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:25.228548 2064791 cri.go:92] found id: ""
	I1219 06:13:25.228563 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.228570 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:25.228578 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:25.228590 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:25.260074 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:25.260090 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:25.316293 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:25.316311 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:25.333755 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:25.333771 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:25.395261 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:25.387061   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.387481   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389091   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389803   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.391447   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:25.387061   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.387481   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389091   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389803   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.391447   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:25.395273 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:25.395290 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:27.958003 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:27.968507 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:27.968571 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:27.994859 2064791 cri.go:92] found id: ""
	I1219 06:13:27.994872 2064791 logs.go:282] 0 containers: []
	W1219 06:13:27.994879 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:27.994884 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:27.994942 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:28.023716 2064791 cri.go:92] found id: ""
	I1219 06:13:28.023729 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.023736 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:28.023741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:28.023807 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:28.048490 2064791 cri.go:92] found id: ""
	I1219 06:13:28.048504 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.048512 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:28.048517 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:28.048575 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:28.074305 2064791 cri.go:92] found id: ""
	I1219 06:13:28.074319 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.074326 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:28.074332 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:28.074392 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:28.098924 2064791 cri.go:92] found id: ""
	I1219 06:13:28.098938 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.098945 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:28.098950 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:28.099021 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:28.123000 2064791 cri.go:92] found id: ""
	I1219 06:13:28.123013 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.123021 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:28.123026 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:28.123091 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:28.150415 2064791 cri.go:92] found id: ""
	I1219 06:13:28.150428 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.150435 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:28.150443 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:28.150453 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:28.210763 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:28.210782 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:28.230191 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:28.230208 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:28.294389 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:28.286078   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.286664   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288135   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288528   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.290322   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:28.286078   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.286664   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288135   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288528   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.290322   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:28.294400 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:28.294411 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:28.357351 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:28.357371 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:30.888172 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:30.898614 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:30.898676 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:30.926377 2064791 cri.go:92] found id: ""
	I1219 06:13:30.926391 2064791 logs.go:282] 0 containers: []
	W1219 06:13:30.926398 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:30.926403 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:30.926458 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:30.950084 2064791 cri.go:92] found id: ""
	I1219 06:13:30.950097 2064791 logs.go:282] 0 containers: []
	W1219 06:13:30.950111 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:30.950117 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:30.950180 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:30.975713 2064791 cri.go:92] found id: ""
	I1219 06:13:30.975726 2064791 logs.go:282] 0 containers: []
	W1219 06:13:30.975734 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:30.975740 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:30.975798 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:31.012698 2064791 cri.go:92] found id: ""
	I1219 06:13:31.012712 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.012719 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:31.012725 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:31.012833 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:31.036945 2064791 cri.go:92] found id: ""
	I1219 06:13:31.036958 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.036965 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:31.036970 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:31.037028 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:31.062431 2064791 cri.go:92] found id: ""
	I1219 06:13:31.062445 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.062452 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:31.062457 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:31.062538 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:31.088075 2064791 cri.go:92] found id: ""
	I1219 06:13:31.088099 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.088106 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:31.088114 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:31.088123 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:31.143908 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:31.143928 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:31.164642 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:31.164661 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:31.241367 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:31.232734   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.233582   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235302   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235873   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.237342   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:31.232734   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.233582   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235302   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235873   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.237342   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:31.241378 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:31.241388 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:31.304583 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:31.304602 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:33.835874 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:33.847289 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:33.847350 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:33.874497 2064791 cri.go:92] found id: ""
	I1219 06:13:33.874511 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.874518 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:33.874523 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:33.874602 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:33.899113 2064791 cri.go:92] found id: ""
	I1219 06:13:33.899127 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.899134 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:33.899139 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:33.899198 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:33.927533 2064791 cri.go:92] found id: ""
	I1219 06:13:33.927546 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.927553 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:33.927559 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:33.927616 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:33.955150 2064791 cri.go:92] found id: ""
	I1219 06:13:33.955163 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.955170 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:33.955176 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:33.955233 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:33.979739 2064791 cri.go:92] found id: ""
	I1219 06:13:33.979753 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.979760 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:33.979765 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:33.979824 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:34.005264 2064791 cri.go:92] found id: ""
	I1219 06:13:34.005283 2064791 logs.go:282] 0 containers: []
	W1219 06:13:34.005291 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:34.005298 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:34.005375 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:34.031917 2064791 cri.go:92] found id: ""
	I1219 06:13:34.031931 2064791 logs.go:282] 0 containers: []
	W1219 06:13:34.031949 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:34.031958 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:34.031968 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:34.098907 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:34.098938 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:34.117494 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:34.117513 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:34.190606 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:34.181776   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.182594   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184322   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184963   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.186654   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:34.181776   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.182594   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184322   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184963   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.186654   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:34.190617 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:34.190630 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:34.260586 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:34.260607 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:36.792986 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:36.803226 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:36.803292 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:36.830943 2064791 cri.go:92] found id: ""
	I1219 06:13:36.830957 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.830964 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:36.830970 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:36.831029 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:36.856036 2064791 cri.go:92] found id: ""
	I1219 06:13:36.856051 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.856058 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:36.856063 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:36.856133 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:36.880807 2064791 cri.go:92] found id: ""
	I1219 06:13:36.880821 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.880828 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:36.880834 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:36.880893 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:36.904515 2064791 cri.go:92] found id: ""
	I1219 06:13:36.904529 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.904536 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:36.904542 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:36.904601 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:36.929517 2064791 cri.go:92] found id: ""
	I1219 06:13:36.929530 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.929538 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:36.929543 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:36.929615 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:36.953623 2064791 cri.go:92] found id: ""
	I1219 06:13:36.953636 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.953644 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:36.953650 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:36.953706 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:36.978769 2064791 cri.go:92] found id: ""
	I1219 06:13:36.978783 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.978790 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:36.978797 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:36.978807 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:37.036051 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:37.036072 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:37.053881 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:37.053898 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:37.117512 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:37.109460   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.110050   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.111554   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.112021   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.113512   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:37.109460   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.110050   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.111554   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.112021   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.113512   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:37.117523 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:37.117532 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:37.185580 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:37.185599 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:39.724185 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:39.735602 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:39.735670 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:39.760200 2064791 cri.go:92] found id: ""
	I1219 06:13:39.760214 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.760222 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:39.760227 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:39.760286 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:39.787416 2064791 cri.go:92] found id: ""
	I1219 06:13:39.787429 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.787437 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:39.787442 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:39.787505 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:39.811808 2064791 cri.go:92] found id: ""
	I1219 06:13:39.811822 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.811830 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:39.811836 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:39.811902 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:39.837072 2064791 cri.go:92] found id: ""
	I1219 06:13:39.837086 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.837093 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:39.837099 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:39.837200 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:39.866418 2064791 cri.go:92] found id: ""
	I1219 06:13:39.866432 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.866438 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:39.866444 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:39.866502 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:39.894744 2064791 cri.go:92] found id: ""
	I1219 06:13:39.894758 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.894765 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:39.894770 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:39.894833 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:39.921608 2064791 cri.go:92] found id: ""
	I1219 06:13:39.921622 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.921629 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:39.921643 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:39.921654 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:39.985200 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:39.985220 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:40.004064 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:40.004091 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:40.077619 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:40.069025   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.069761   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.071391   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.072228   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.073414   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:40.069025   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.069761   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.071391   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.072228   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.073414   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:40.077631 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:40.077641 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:40.142102 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:40.142127 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:42.682372 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:42.692608 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:42.692675 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:42.716749 2064791 cri.go:92] found id: ""
	I1219 06:13:42.716796 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.716804 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:42.716809 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:42.716888 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:42.740973 2064791 cri.go:92] found id: ""
	I1219 06:13:42.740986 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.740993 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:42.740999 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:42.741064 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:42.765521 2064791 cri.go:92] found id: ""
	I1219 06:13:42.765535 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.765543 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:42.765548 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:42.765607 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:42.790000 2064791 cri.go:92] found id: ""
	I1219 06:13:42.790015 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.790034 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:42.790040 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:42.790107 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:42.813722 2064791 cri.go:92] found id: ""
	I1219 06:13:42.813736 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.813743 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:42.813752 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:42.813814 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:42.838912 2064791 cri.go:92] found id: ""
	I1219 06:13:42.838926 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.838934 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:42.838939 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:42.839002 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:42.867044 2064791 cri.go:92] found id: ""
	I1219 06:13:42.867058 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.867065 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:42.867073 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:42.867083 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:42.923612 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:42.923632 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:42.941274 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:42.941293 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:43.008705 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:42.998396   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:42.999004   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000468   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000919   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.003777   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:42.998396   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:42.999004   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000468   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000919   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.003777   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:43.008716 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:43.008736 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:43.074629 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:43.074654 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:45.608725 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:45.619043 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:45.619107 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:45.645025 2064791 cri.go:92] found id: ""
	I1219 06:13:45.645041 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.645049 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:45.645054 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:45.645120 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:45.671700 2064791 cri.go:92] found id: ""
	I1219 06:13:45.671716 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.671723 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:45.671735 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:45.671797 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:45.701839 2064791 cri.go:92] found id: ""
	I1219 06:13:45.701864 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.701872 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:45.701878 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:45.701947 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:45.731819 2064791 cri.go:92] found id: ""
	I1219 06:13:45.731834 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.731841 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:45.731847 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:45.731910 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:45.758372 2064791 cri.go:92] found id: ""
	I1219 06:13:45.758386 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.758393 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:45.758399 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:45.758464 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:45.784713 2064791 cri.go:92] found id: ""
	I1219 06:13:45.784727 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.784734 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:45.784739 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:45.784829 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:45.811948 2064791 cri.go:92] found id: ""
	I1219 06:13:45.811962 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.811969 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:45.811977 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:45.811987 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:45.868299 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:45.868317 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:45.886032 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:45.886049 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:45.952733 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:45.944285   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.944957   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946453   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946859   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.948306   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:45.944285   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.944957   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946453   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946859   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.948306   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:45.952743 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:45.952783 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:46.020565 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:46.020588 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:48.550865 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:48.561408 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:48.561483 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:48.586776 2064791 cri.go:92] found id: ""
	I1219 06:13:48.586790 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.586797 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:48.586802 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:48.586864 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:48.612701 2064791 cri.go:92] found id: ""
	I1219 06:13:48.612715 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.612722 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:48.612727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:48.612808 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:48.637097 2064791 cri.go:92] found id: ""
	I1219 06:13:48.637110 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.637118 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:48.637124 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:48.637183 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:48.662701 2064791 cri.go:92] found id: ""
	I1219 06:13:48.662715 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.662722 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:48.662727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:48.662785 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:48.690291 2064791 cri.go:92] found id: ""
	I1219 06:13:48.690304 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.690311 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:48.690316 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:48.690376 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:48.715968 2064791 cri.go:92] found id: ""
	I1219 06:13:48.715983 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.715990 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:48.715995 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:48.716059 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:48.741069 2064791 cri.go:92] found id: ""
	I1219 06:13:48.741082 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.741090 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:48.741097 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:48.741113 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:48.796842 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:48.796863 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:48.814146 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:48.814166 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:48.879995 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:48.871133   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.871771   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.873597   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.874180   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.875997   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:48.871133   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.871771   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.873597   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.874180   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.875997   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:48.880005 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:48.880017 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:48.943211 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:48.943231 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:51.472961 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:51.483727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:51.483805 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:51.513333 2064791 cri.go:92] found id: ""
	I1219 06:13:51.513347 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.513354 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:51.513360 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:51.513426 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:51.539359 2064791 cri.go:92] found id: ""
	I1219 06:13:51.539373 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.539380 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:51.539392 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:51.539449 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:51.564730 2064791 cri.go:92] found id: ""
	I1219 06:13:51.564743 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.564750 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:51.564794 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:51.564855 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:51.590117 2064791 cri.go:92] found id: ""
	I1219 06:13:51.590138 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.590145 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:51.590150 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:51.590210 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:51.614688 2064791 cri.go:92] found id: ""
	I1219 06:13:51.614702 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.614709 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:51.614715 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:51.614778 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:51.638492 2064791 cri.go:92] found id: ""
	I1219 06:13:51.638508 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.638518 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:51.638524 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:51.638597 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:51.666861 2064791 cri.go:92] found id: ""
	I1219 06:13:51.666874 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.666881 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:51.666888 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:51.666899 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:51.731208 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:51.723294   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.723881   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725543   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725958   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.727478   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:51.723294   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.723881   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725543   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725958   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.727478   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:51.731218 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:51.731228 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:51.793354 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:51.793375 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:51.819761 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:51.819784 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:51.877976 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:51.877996 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:54.395396 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:54.405788 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:54.405848 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:54.444121 2064791 cri.go:92] found id: ""
	I1219 06:13:54.444151 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.444159 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:54.444164 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:54.444243 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:54.471038 2064791 cri.go:92] found id: ""
	I1219 06:13:54.471064 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.471072 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:54.471077 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:54.471160 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:54.500364 2064791 cri.go:92] found id: ""
	I1219 06:13:54.500377 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.500385 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:54.500390 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:54.500450 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:54.525919 2064791 cri.go:92] found id: ""
	I1219 06:13:54.525934 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.525941 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:54.525962 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:54.526021 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:54.551211 2064791 cri.go:92] found id: ""
	I1219 06:13:54.551225 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.551232 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:54.551239 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:54.551310 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:54.577841 2064791 cri.go:92] found id: ""
	I1219 06:13:54.577854 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.577861 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:54.577866 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:54.577931 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:54.602636 2064791 cri.go:92] found id: ""
	I1219 06:13:54.602650 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.602656 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:54.602664 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:54.602675 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:54.619644 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:54.619661 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:54.682901 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:54.674199   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.675104   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.676718   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.677329   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.678988   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:54.674199   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.675104   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.676718   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.677329   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.678988   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:54.682911 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:54.682921 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:54.749370 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:54.749393 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:54.780731 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:54.780747 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:57.338712 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:57.349237 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:57.349299 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:57.378161 2064791 cri.go:92] found id: ""
	I1219 06:13:57.378175 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.378181 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:57.378187 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:57.378247 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:57.403073 2064791 cri.go:92] found id: ""
	I1219 06:13:57.403087 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.403094 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:57.403099 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:57.403160 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:57.431222 2064791 cri.go:92] found id: ""
	I1219 06:13:57.431236 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.431244 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:57.431249 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:57.431306 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:57.466943 2064791 cri.go:92] found id: ""
	I1219 06:13:57.466957 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.466964 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:57.466969 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:57.467027 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:57.493181 2064791 cri.go:92] found id: ""
	I1219 06:13:57.493194 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.493201 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:57.493206 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:57.493265 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:57.517521 2064791 cri.go:92] found id: ""
	I1219 06:13:57.517534 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.517543 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:57.517549 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:57.517606 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:57.546827 2064791 cri.go:92] found id: ""
	I1219 06:13:57.546841 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.546848 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:57.546856 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:57.546865 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:57.603521 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:57.603540 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:57.620971 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:57.620988 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:57.687316 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:57.678838   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.679404   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.680987   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.681452   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.683006   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:57.678838   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.679404   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.680987   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.681452   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.683006   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:57.687326 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:57.687336 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:57.759758 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:57.759787 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:00.293478 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:00.313120 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:00.313205 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:00.349921 2064791 cri.go:92] found id: ""
	I1219 06:14:00.349938 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.349947 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:00.349953 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:00.350031 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:00.381005 2064791 cri.go:92] found id: ""
	I1219 06:14:00.381022 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.381031 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:00.381037 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:00.381113 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:00.415179 2064791 cri.go:92] found id: ""
	I1219 06:14:00.415194 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.415202 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:00.415207 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:00.415268 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:00.455068 2064791 cri.go:92] found id: ""
	I1219 06:14:00.455084 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.455090 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:00.455096 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:00.455170 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:00.488360 2064791 cri.go:92] found id: ""
	I1219 06:14:00.488374 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.488382 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:00.488387 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:00.488450 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:00.514399 2064791 cri.go:92] found id: ""
	I1219 06:14:00.514414 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.514420 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:00.514426 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:00.514485 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:00.544639 2064791 cri.go:92] found id: ""
	I1219 06:14:00.544655 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.544662 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:00.544670 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:00.544683 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:00.562442 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:00.562459 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:00.630032 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:00.620824   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.621603   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.623426   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.624012   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.625740   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:00.620824   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.621603   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.623426   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.624012   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.625740   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:00.630043 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:00.630053 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:00.693056 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:00.693075 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:00.724344 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:00.724362 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:03.282407 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:03.292404 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:03.292463 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:03.322285 2064791 cri.go:92] found id: ""
	I1219 06:14:03.322298 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.322305 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:03.322310 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:03.322377 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:03.345824 2064791 cri.go:92] found id: ""
	I1219 06:14:03.345838 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.345846 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:03.345852 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:03.345913 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:03.369194 2064791 cri.go:92] found id: ""
	I1219 06:14:03.369208 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.369214 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:03.369220 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:03.369280 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:03.393453 2064791 cri.go:92] found id: ""
	I1219 06:14:03.393467 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.393474 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:03.393479 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:03.393538 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:03.423067 2064791 cri.go:92] found id: ""
	I1219 06:14:03.423082 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.423088 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:03.423093 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:03.423149 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:03.449404 2064791 cri.go:92] found id: ""
	I1219 06:14:03.449418 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.449424 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:03.449430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:03.449491 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:03.483320 2064791 cri.go:92] found id: ""
	I1219 06:14:03.483334 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.483342 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:03.483349 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:03.483360 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:03.546816 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:03.538361   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.539133   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.540745   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.541423   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.542995   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:03.538361   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.539133   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.540745   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.541423   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.542995   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:03.546828 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:03.546840 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:03.608924 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:03.608943 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:03.640931 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:03.640947 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:03.698583 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:03.698601 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:06.217289 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:06.228468 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:06.228538 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:06.254249 2064791 cri.go:92] found id: ""
	I1219 06:14:06.254264 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.254271 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:06.254276 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:06.254335 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:06.278663 2064791 cri.go:92] found id: ""
	I1219 06:14:06.278677 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.278685 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:06.278691 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:06.278751 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:06.304128 2064791 cri.go:92] found id: ""
	I1219 06:14:06.304143 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.304150 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:06.304162 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:06.304224 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:06.330238 2064791 cri.go:92] found id: ""
	I1219 06:14:06.330252 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.330259 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:06.330265 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:06.330326 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:06.354219 2064791 cri.go:92] found id: ""
	I1219 06:14:06.354234 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.354241 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:06.354246 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:06.354307 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:06.382747 2064791 cri.go:92] found id: ""
	I1219 06:14:06.382762 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.382769 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:06.382777 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:06.382837 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:06.421656 2064791 cri.go:92] found id: ""
	I1219 06:14:06.421670 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.421677 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:06.421685 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:06.421694 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:06.498836 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:06.498857 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:06.531636 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:06.531653 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:06.590085 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:06.590106 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:06.608226 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:06.608243 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:06.675159 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:06.667629   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.668042   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669562   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669908   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.671371   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:06.667629   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.668042   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669562   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669908   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.671371   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:09.176005 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:09.186839 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:09.186916 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:09.211786 2064791 cri.go:92] found id: ""
	I1219 06:14:09.211800 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.211807 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:09.211812 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:09.211873 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:09.240415 2064791 cri.go:92] found id: ""
	I1219 06:14:09.240429 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.240436 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:09.240441 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:09.240503 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:09.266183 2064791 cri.go:92] found id: ""
	I1219 06:14:09.266197 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.266204 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:09.266209 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:09.266269 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:09.294483 2064791 cri.go:92] found id: ""
	I1219 06:14:09.294497 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.294504 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:09.294509 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:09.294572 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:09.319997 2064791 cri.go:92] found id: ""
	I1219 06:14:09.320011 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.320019 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:09.320024 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:09.320113 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:09.346661 2064791 cri.go:92] found id: ""
	I1219 06:14:09.346675 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.346683 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:09.346688 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:09.346746 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:09.371664 2064791 cri.go:92] found id: ""
	I1219 06:14:09.371690 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.371698 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:09.371706 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:09.371717 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:09.389515 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:09.389534 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:09.473775 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:09.465363   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.465921   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.467541   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.468108   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.469733   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:09.465363   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.465921   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.467541   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.468108   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.469733   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:09.473785 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:09.473796 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:09.541712 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:09.541736 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:09.577440 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:09.577456 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:12.133722 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:12.144214 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:12.144277 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:12.170929 2064791 cri.go:92] found id: ""
	I1219 06:14:12.170944 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.170951 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:12.170956 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:12.171026 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:12.195988 2064791 cri.go:92] found id: ""
	I1219 06:14:12.196002 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.196008 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:12.196014 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:12.196073 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:12.221254 2064791 cri.go:92] found id: ""
	I1219 06:14:12.221269 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.221276 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:12.221281 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:12.221346 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:12.246403 2064791 cri.go:92] found id: ""
	I1219 06:14:12.246417 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.246424 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:12.246430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:12.246491 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:12.271124 2064791 cri.go:92] found id: ""
	I1219 06:14:12.271139 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.271145 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:12.271150 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:12.271209 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:12.296180 2064791 cri.go:92] found id: ""
	I1219 06:14:12.296194 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.296211 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:12.296216 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:12.296284 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:12.322520 2064791 cri.go:92] found id: ""
	I1219 06:14:12.322534 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.322541 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:12.322548 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:12.322559 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:12.349890 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:12.349907 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:12.407189 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:12.407210 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:12.426453 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:12.426469 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:12.499487 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:12.491549   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.491978   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493525   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493875   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.495402   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:12.491549   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.491978   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493525   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493875   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.495402   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:12.499498 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:12.499509 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:15.067160 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:15.078543 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:15.078611 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:15.104838 2064791 cri.go:92] found id: ""
	I1219 06:14:15.104852 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.104860 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:15.104865 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:15.104933 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:15.130179 2064791 cri.go:92] found id: ""
	I1219 06:14:15.130194 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.130201 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:15.130207 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:15.130268 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:15.156134 2064791 cri.go:92] found id: ""
	I1219 06:14:15.156147 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.156154 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:15.156159 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:15.156221 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:15.182543 2064791 cri.go:92] found id: ""
	I1219 06:14:15.182557 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.182564 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:15.182570 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:15.182631 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:15.212350 2064791 cri.go:92] found id: ""
	I1219 06:14:15.212364 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.212371 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:15.212376 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:15.212437 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:15.239403 2064791 cri.go:92] found id: ""
	I1219 06:14:15.239418 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.239425 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:15.239430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:15.239490 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:15.265288 2064791 cri.go:92] found id: ""
	I1219 06:14:15.265303 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.265310 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:15.265318 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:15.265328 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:15.322825 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:15.322845 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:15.339946 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:15.339963 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:15.406282 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:15.394886   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.395459   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397067   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397414   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.399914   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:15.394886   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.395459   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397067   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397414   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.399914   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:15.406294 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:15.406305 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:15.481322 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:15.481342 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:18.011054 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:18.022305 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:18.022367 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:18.048236 2064791 cri.go:92] found id: ""
	I1219 06:14:18.048250 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.048257 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:18.048262 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:18.048326 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:18.075811 2064791 cri.go:92] found id: ""
	I1219 06:14:18.075825 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.075833 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:18.075839 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:18.075911 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:18.101578 2064791 cri.go:92] found id: ""
	I1219 06:14:18.101593 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.101601 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:18.101607 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:18.101668 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:18.127312 2064791 cri.go:92] found id: ""
	I1219 06:14:18.127327 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.127335 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:18.127341 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:18.127400 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:18.153616 2064791 cri.go:92] found id: ""
	I1219 06:14:18.153630 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.153637 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:18.153642 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:18.153702 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:18.177937 2064791 cri.go:92] found id: ""
	I1219 06:14:18.177959 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.177967 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:18.177972 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:18.178044 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:18.211563 2064791 cri.go:92] found id: ""
	I1219 06:14:18.211576 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.211583 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:18.211591 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:18.211614 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:18.270162 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:18.270182 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:18.288230 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:18.288247 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:18.351713 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:18.343431   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.343955   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.345552   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.346118   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.347689   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:18.343431   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.343955   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.345552   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.346118   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.347689   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:18.351723 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:18.351734 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:18.415359 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:18.415379 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:20.949383 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:20.959444 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:20.959504 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:20.984028 2064791 cri.go:92] found id: ""
	I1219 06:14:20.984041 2064791 logs.go:282] 0 containers: []
	W1219 06:14:20.984048 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:20.984054 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:20.984114 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:21.011129 2064791 cri.go:92] found id: ""
	I1219 06:14:21.011145 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.011153 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:21.011159 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:21.011232 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:21.036500 2064791 cri.go:92] found id: ""
	I1219 06:14:21.036515 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.036522 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:21.036528 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:21.036593 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:21.061075 2064791 cri.go:92] found id: ""
	I1219 06:14:21.061092 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.061099 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:21.061106 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:21.061164 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:21.086516 2064791 cri.go:92] found id: ""
	I1219 06:14:21.086532 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.086539 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:21.086545 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:21.086606 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:21.110771 2064791 cri.go:92] found id: ""
	I1219 06:14:21.110791 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.110798 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:21.110804 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:21.110861 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:21.135223 2064791 cri.go:92] found id: ""
	I1219 06:14:21.135237 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.135244 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:21.135253 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:21.135262 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:21.198022 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:21.198041 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:21.227058 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:21.227074 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:21.285376 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:21.285395 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:21.302978 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:21.302996 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:21.371361 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:21.363586   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.364188   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365313   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365885   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.367496   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:21.363586   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.364188   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365313   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365885   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.367496   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:23.871625 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:23.882253 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:23.882315 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:23.911700 2064791 cri.go:92] found id: ""
	I1219 06:14:23.911715 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.911722 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:23.911727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:23.911792 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:23.940526 2064791 cri.go:92] found id: ""
	I1219 06:14:23.940542 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.940549 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:23.940554 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:23.940613 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:23.965505 2064791 cri.go:92] found id: ""
	I1219 06:14:23.965520 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.965527 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:23.965532 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:23.965592 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:23.990160 2064791 cri.go:92] found id: ""
	I1219 06:14:23.990174 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.990180 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:23.990186 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:23.990244 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:24.020703 2064791 cri.go:92] found id: ""
	I1219 06:14:24.020718 2064791 logs.go:282] 0 containers: []
	W1219 06:14:24.020731 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:24.020736 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:24.020818 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:24.045597 2064791 cri.go:92] found id: ""
	I1219 06:14:24.045611 2064791 logs.go:282] 0 containers: []
	W1219 06:14:24.045619 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:24.045625 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:24.045687 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:24.070650 2064791 cri.go:92] found id: ""
	I1219 06:14:24.070665 2064791 logs.go:282] 0 containers: []
	W1219 06:14:24.070673 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:24.070681 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:24.070692 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:24.088118 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:24.088135 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:24.154756 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:24.145796   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.146235   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.147863   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.148542   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.150277   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:24.145796   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.146235   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.147863   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.148542   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.150277   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:24.154766 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:24.154777 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:24.222682 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:24.222712 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:24.251017 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:24.251036 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:26.810547 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:26.821800 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:26.821882 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:26.851618 2064791 cri.go:92] found id: ""
	I1219 06:14:26.851632 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.851639 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:26.851644 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:26.851701 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:26.881247 2064791 cri.go:92] found id: ""
	I1219 06:14:26.881261 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.881268 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:26.881273 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:26.881331 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:26.906685 2064791 cri.go:92] found id: ""
	I1219 06:14:26.906698 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.906705 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:26.906710 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:26.906769 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:26.930800 2064791 cri.go:92] found id: ""
	I1219 06:14:26.930814 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.930821 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:26.930826 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:26.930886 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:26.955923 2064791 cri.go:92] found id: ""
	I1219 06:14:26.955936 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.955943 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:26.955949 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:26.956007 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:26.981009 2064791 cri.go:92] found id: ""
	I1219 06:14:26.981023 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.981030 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:26.981036 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:26.981100 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:27.008093 2064791 cri.go:92] found id: ""
	I1219 06:14:27.008107 2064791 logs.go:282] 0 containers: []
	W1219 06:14:27.008115 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:27.008123 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:27.008133 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:27.064465 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:27.064484 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:27.082027 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:27.082043 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:27.147050 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:27.138327   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.139038   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.140978   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.141575   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.143181   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:27.138327   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.139038   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.140978   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.141575   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.143181   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:27.147061 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:27.147072 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:27.209843 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:27.209866 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:29.744581 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:29.755392 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:29.755453 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:29.786638 2064791 cri.go:92] found id: ""
	I1219 06:14:29.786652 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.786659 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:29.786664 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:29.786724 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:29.812212 2064791 cri.go:92] found id: ""
	I1219 06:14:29.812225 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.812232 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:29.812237 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:29.812296 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:29.836877 2064791 cri.go:92] found id: ""
	I1219 06:14:29.836892 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.836899 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:29.836905 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:29.836964 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:29.861702 2064791 cri.go:92] found id: ""
	I1219 06:14:29.861715 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.861722 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:29.861727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:29.861786 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:29.885680 2064791 cri.go:92] found id: ""
	I1219 06:14:29.885694 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.885703 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:29.885708 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:29.885770 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:29.910947 2064791 cri.go:92] found id: ""
	I1219 06:14:29.910961 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.910968 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:29.910973 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:29.911034 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:29.935050 2064791 cri.go:92] found id: ""
	I1219 06:14:29.935065 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.935072 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:29.935080 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:29.935090 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:29.998135 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:29.998156 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:30.043603 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:30.043622 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:30.105767 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:30.105788 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:30.123694 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:30.123713 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:30.194778 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:30.185852   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.186558   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188234   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188924   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.190530   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:30.185852   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.186558   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188234   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188924   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.190530   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:32.694996 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:32.706674 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:32.706732 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:32.732252 2064791 cri.go:92] found id: ""
	I1219 06:14:32.732268 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.732276 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:32.732282 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:32.732344 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:32.758653 2064791 cri.go:92] found id: ""
	I1219 06:14:32.758667 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.758674 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:32.758679 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:32.758739 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:32.784000 2064791 cri.go:92] found id: ""
	I1219 06:14:32.784015 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.784032 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:32.784037 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:32.784104 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:32.812817 2064791 cri.go:92] found id: ""
	I1219 06:14:32.812840 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.812847 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:32.812856 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:32.812927 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:32.838382 2064791 cri.go:92] found id: ""
	I1219 06:14:32.838396 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.838404 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:32.838409 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:32.838470 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:32.865911 2064791 cri.go:92] found id: ""
	I1219 06:14:32.865929 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.865937 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:32.865944 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:32.866010 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:32.890355 2064791 cri.go:92] found id: ""
	I1219 06:14:32.890369 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.890376 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:32.890384 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:32.890394 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:32.946230 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:32.946249 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:32.964055 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:32.964071 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:33.030318 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:33.021305   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.022105   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.023970   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.024656   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.026568   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:33.021305   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.022105   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.023970   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.024656   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.026568   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:33.030328 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:33.030341 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:33.097167 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:33.097188 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:35.628021 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:35.638217 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:35.638279 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:35.675187 2064791 cri.go:92] found id: ""
	I1219 06:14:35.675209 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.675217 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:35.675223 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:35.675283 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:35.703303 2064791 cri.go:92] found id: ""
	I1219 06:14:35.703317 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.703324 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:35.703329 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:35.703387 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:35.736481 2064791 cri.go:92] found id: ""
	I1219 06:14:35.736495 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.736502 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:35.736507 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:35.736571 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:35.761459 2064791 cri.go:92] found id: ""
	I1219 06:14:35.761472 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.761479 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:35.761485 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:35.761542 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:35.785228 2064791 cri.go:92] found id: ""
	I1219 06:14:35.785242 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.785249 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:35.785255 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:35.785317 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:35.811887 2064791 cri.go:92] found id: ""
	I1219 06:14:35.811901 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.811908 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:35.811913 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:35.811971 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:35.837382 2064791 cri.go:92] found id: ""
	I1219 06:14:35.837395 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.837402 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:35.837410 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:35.837420 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:35.893642 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:35.893663 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:35.911983 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:35.911999 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:35.979649 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:35.971161   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.971848   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.973453   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.974018   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.975638   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:35.971161   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.971848   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.973453   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.974018   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.975638   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:35.979659 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:35.979669 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:36.041989 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:36.042008 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:38.571113 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:38.581755 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:38.581829 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:38.606952 2064791 cri.go:92] found id: ""
	I1219 06:14:38.606977 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.606985 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:38.607000 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:38.607062 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:38.641457 2064791 cri.go:92] found id: ""
	I1219 06:14:38.641470 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.641477 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:38.641482 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:38.641544 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:38.675510 2064791 cri.go:92] found id: ""
	I1219 06:14:38.675523 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.675530 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:38.675536 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:38.675597 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:38.701888 2064791 cri.go:92] found id: ""
	I1219 06:14:38.701902 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.701909 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:38.701915 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:38.701975 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:38.728277 2064791 cri.go:92] found id: ""
	I1219 06:14:38.728290 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.728299 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:38.728305 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:38.728365 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:38.755404 2064791 cri.go:92] found id: ""
	I1219 06:14:38.755418 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.755427 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:38.755433 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:38.755495 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:38.778883 2064791 cri.go:92] found id: ""
	I1219 06:14:38.778896 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.778903 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:38.778911 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:38.778921 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:38.807023 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:38.807039 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:38.867198 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:38.867217 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:38.885283 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:38.885299 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:38.953980 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:38.945374   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.946114   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.947740   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.948287   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.949494   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:38.945374   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.946114   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.947740   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.948287   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.949494   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:38.953990 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:38.954002 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:41.516935 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:41.527938 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:41.528001 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:41.553242 2064791 cri.go:92] found id: ""
	I1219 06:14:41.553256 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.553263 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:41.553268 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:41.553333 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:41.579295 2064791 cri.go:92] found id: ""
	I1219 06:14:41.579309 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.579316 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:41.579321 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:41.579385 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:41.605144 2064791 cri.go:92] found id: ""
	I1219 06:14:41.605157 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.605164 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:41.605169 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:41.605237 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:41.629732 2064791 cri.go:92] found id: ""
	I1219 06:14:41.629747 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.629754 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:41.629760 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:41.629822 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:41.659346 2064791 cri.go:92] found id: ""
	I1219 06:14:41.659361 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.659368 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:41.659373 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:41.659432 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:41.690573 2064791 cri.go:92] found id: ""
	I1219 06:14:41.690598 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.690606 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:41.690612 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:41.690681 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:41.732984 2064791 cri.go:92] found id: ""
	I1219 06:14:41.732998 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.733006 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:41.733013 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:41.733023 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:41.795851 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:41.795871 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:41.825041 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:41.825056 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:41.886639 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:41.886659 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:41.904083 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:41.904100 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:41.971851 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:41.963475   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.964169   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.965774   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.966365   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.967995   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:41.963475   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.964169   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.965774   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.966365   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.967995   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:44.473271 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:44.483164 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:44.483222 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:44.511046 2064791 cri.go:92] found id: ""
	I1219 06:14:44.511060 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.511067 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:44.511072 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:44.511131 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:44.536197 2064791 cri.go:92] found id: ""
	I1219 06:14:44.536211 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.536219 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:44.536224 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:44.536283 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:44.562337 2064791 cri.go:92] found id: ""
	I1219 06:14:44.562354 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.562360 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:44.562366 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:44.562474 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:44.587553 2064791 cri.go:92] found id: ""
	I1219 06:14:44.587567 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.587574 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:44.587579 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:44.587637 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:44.614987 2064791 cri.go:92] found id: ""
	I1219 06:14:44.615000 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.615007 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:44.615012 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:44.615070 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:44.638714 2064791 cri.go:92] found id: ""
	I1219 06:14:44.638727 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.638734 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:44.638740 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:44.638800 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:44.688380 2064791 cri.go:92] found id: ""
	I1219 06:14:44.688393 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.688401 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:44.688409 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:44.688419 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:44.752969 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:44.752989 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:44.770407 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:44.770424 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:44.837420 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:44.829128   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.829667   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831337   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831916   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.833576   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:44.829128   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.829667   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831337   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831916   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.833576   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:44.837430 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:44.837440 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:44.899538 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:44.899557 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:47.426650 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:47.436749 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:47.436827 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:47.461986 2064791 cri.go:92] found id: ""
	I1219 06:14:47.462000 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.462007 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:47.462012 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:47.462071 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:47.487738 2064791 cri.go:92] found id: ""
	I1219 06:14:47.487765 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.487785 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:47.487790 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:47.487934 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:47.517657 2064791 cri.go:92] found id: ""
	I1219 06:14:47.517671 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.517678 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:47.517683 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:47.517741 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:47.541725 2064791 cri.go:92] found id: ""
	I1219 06:14:47.541740 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.541747 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:47.541752 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:47.541811 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:47.566613 2064791 cri.go:92] found id: ""
	I1219 06:14:47.566627 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.566634 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:47.566640 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:47.566698 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:47.593670 2064791 cri.go:92] found id: ""
	I1219 06:14:47.593683 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.593690 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:47.593705 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:47.593778 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:47.617501 2064791 cri.go:92] found id: ""
	I1219 06:14:47.617516 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.617523 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:47.617530 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:47.617544 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:47.699175 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:47.685609   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.686090   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.691538   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.692419   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.694959   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:47.685609   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.686090   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.691538   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.692419   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.694959   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:47.699185 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:47.699195 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:47.763955 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:47.763976 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:47.796195 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:47.796212 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:47.855457 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:47.855477 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:50.373913 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:50.384678 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:50.384743 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:50.409292 2064791 cri.go:92] found id: ""
	I1219 06:14:50.409305 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.409314 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:50.409319 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:50.409380 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:50.434622 2064791 cri.go:92] found id: ""
	I1219 06:14:50.434637 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.434644 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:50.434649 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:50.434708 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:50.462727 2064791 cri.go:92] found id: ""
	I1219 06:14:50.462741 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.462748 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:50.462754 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:50.462818 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:50.487565 2064791 cri.go:92] found id: ""
	I1219 06:14:50.487578 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.487586 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:50.487593 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:50.487655 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:50.514337 2064791 cri.go:92] found id: ""
	I1219 06:14:50.514351 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.514358 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:50.514363 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:50.514428 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:50.538808 2064791 cri.go:92] found id: ""
	I1219 06:14:50.538822 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.538829 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:50.538835 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:50.538900 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:50.562833 2064791 cri.go:92] found id: ""
	I1219 06:14:50.562847 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.562854 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:50.562862 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:50.562872 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:50.630176 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:50.621836   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.622705   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624224   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624675   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.626153   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:50.621836   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.622705   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624224   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624675   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.626153   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:50.630187 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:50.630197 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:50.701427 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:50.701449 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:50.729581 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:50.729602 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:50.786455 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:50.786479 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:53.304847 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:53.315504 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:53.315564 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:53.340157 2064791 cri.go:92] found id: ""
	I1219 06:14:53.340172 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.340179 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:53.340184 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:53.340242 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:53.368950 2064791 cri.go:92] found id: ""
	I1219 06:14:53.368964 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.368971 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:53.368976 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:53.369037 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:53.393336 2064791 cri.go:92] found id: ""
	I1219 06:14:53.393349 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.393356 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:53.393362 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:53.393419 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:53.417054 2064791 cri.go:92] found id: ""
	I1219 06:14:53.417069 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.417085 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:53.417091 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:53.417163 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:53.440932 2064791 cri.go:92] found id: ""
	I1219 06:14:53.440946 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.440953 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:53.440958 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:53.441016 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:53.464424 2064791 cri.go:92] found id: ""
	I1219 06:14:53.464437 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.464444 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:53.464449 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:53.464509 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:53.488126 2064791 cri.go:92] found id: ""
	I1219 06:14:53.488143 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.488150 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:53.488158 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:53.488168 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:53.558644 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:53.550747   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.551416   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553060   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553386   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.554859   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:53.550747   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.551416   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553060   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553386   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.554859   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:53.558655 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:53.558665 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:53.622193 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:53.622214 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:53.650744 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:53.650759 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:53.710733 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:53.710750 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:56.228553 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:56.238967 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:56.239030 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:56.262851 2064791 cri.go:92] found id: ""
	I1219 06:14:56.262864 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.262872 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:56.262877 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:56.262943 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:56.287030 2064791 cri.go:92] found id: ""
	I1219 06:14:56.287043 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.287050 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:56.287056 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:56.287118 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:56.312417 2064791 cri.go:92] found id: ""
	I1219 06:14:56.312430 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.312437 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:56.312442 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:56.312505 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:56.350599 2064791 cri.go:92] found id: ""
	I1219 06:14:56.350613 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.350622 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:56.350627 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:56.350686 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:56.374515 2064791 cri.go:92] found id: ""
	I1219 06:14:56.374528 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.374535 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:56.374540 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:56.374596 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:56.399267 2064791 cri.go:92] found id: ""
	I1219 06:14:56.399281 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.399288 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:56.399293 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:56.399351 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:56.424503 2064791 cri.go:92] found id: ""
	I1219 06:14:56.424516 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.424523 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:56.424531 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:56.424541 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:56.490954 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:56.490973 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:56.522329 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:56.522345 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:56.582279 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:56.582298 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:56.599656 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:56.599673 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:56.665092 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:56.656833   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.657522   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659110   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659433   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.661032   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:56.656833   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.657522   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659110   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659433   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.661032   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:59.165361 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:59.178705 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:59.178767 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:59.207415 2064791 cri.go:92] found id: ""
	I1219 06:14:59.207429 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.207436 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:59.207441 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:59.207499 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:59.231912 2064791 cri.go:92] found id: ""
	I1219 06:14:59.231926 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.231934 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:59.231939 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:59.232000 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:59.258822 2064791 cri.go:92] found id: ""
	I1219 06:14:59.258836 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.258843 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:59.258848 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:59.258909 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:59.283942 2064791 cri.go:92] found id: ""
	I1219 06:14:59.283955 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.283963 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:59.283968 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:59.284026 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:59.311236 2064791 cri.go:92] found id: ""
	I1219 06:14:59.311249 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.311256 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:59.311262 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:59.311322 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:59.336239 2064791 cri.go:92] found id: ""
	I1219 06:14:59.336253 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.336260 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:59.336267 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:59.336325 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:59.360395 2064791 cri.go:92] found id: ""
	I1219 06:14:59.360409 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.360417 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:59.360425 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:59.360435 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:59.423580 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:59.423601 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:59.453489 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:59.453506 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:59.512842 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:59.512862 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:59.530149 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:59.530168 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:59.593869 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:59.584731   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.585448   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587088   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587675   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.589312   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:59.584731   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.585448   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587088   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587675   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.589312   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:02.094126 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:02.104778 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:02.104839 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:02.129446 2064791 cri.go:92] found id: ""
	I1219 06:15:02.129462 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.129469 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:02.129474 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:02.129539 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:02.154875 2064791 cri.go:92] found id: ""
	I1219 06:15:02.154889 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.154896 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:02.154901 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:02.155006 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:02.180628 2064791 cri.go:92] found id: ""
	I1219 06:15:02.180643 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.180650 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:02.180655 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:02.180716 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:02.205447 2064791 cri.go:92] found id: ""
	I1219 06:15:02.205462 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.205469 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:02.205475 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:02.205543 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:02.233523 2064791 cri.go:92] found id: ""
	I1219 06:15:02.233537 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.233544 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:02.233550 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:02.233610 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:02.259723 2064791 cri.go:92] found id: ""
	I1219 06:15:02.259738 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.259744 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:02.259750 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:02.259813 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:02.289093 2064791 cri.go:92] found id: ""
	I1219 06:15:02.289108 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.289115 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:02.289123 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:02.289133 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:02.347737 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:02.347758 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:02.365547 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:02.365564 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:02.433606 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:02.424090   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425124   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425822   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.427646   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.428231   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:02.424090   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425124   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425822   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.427646   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.428231   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:02.433616 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:02.433627 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:02.497677 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:02.497697 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:05.027685 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:05.037775 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:05.037845 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:05.062132 2064791 cri.go:92] found id: ""
	I1219 06:15:05.062146 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.062152 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:05.062157 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:05.062230 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:05.087233 2064791 cri.go:92] found id: ""
	I1219 06:15:05.087247 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.087254 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:05.087259 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:05.087318 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:05.116140 2064791 cri.go:92] found id: ""
	I1219 06:15:05.116155 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.116162 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:05.116167 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:05.116229 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:05.141158 2064791 cri.go:92] found id: ""
	I1219 06:15:05.141171 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.141179 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:05.141184 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:05.141255 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:05.166033 2064791 cri.go:92] found id: ""
	I1219 06:15:05.166046 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.166053 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:05.166059 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:05.166118 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:05.189930 2064791 cri.go:92] found id: ""
	I1219 06:15:05.189943 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.189951 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:05.189956 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:05.190013 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:05.217697 2064791 cri.go:92] found id: ""
	I1219 06:15:05.217711 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.217718 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:05.217726 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:05.217737 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:05.273609 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:05.273629 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:05.291274 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:05.291291 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:05.355137 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:05.346587   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.347461   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349261   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349738   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.351338   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:05.346587   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.347461   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349261   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349738   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.351338   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:05.355147 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:05.355158 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:05.418376 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:05.418395 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:07.946932 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:07.957404 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:07.957465 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:07.983257 2064791 cri.go:92] found id: ""
	I1219 06:15:07.983270 2064791 logs.go:282] 0 containers: []
	W1219 06:15:07.983277 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:07.983283 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:07.983344 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:08.010747 2064791 cri.go:92] found id: ""
	I1219 06:15:08.010762 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.010770 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:08.010776 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:08.010842 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:08.040479 2064791 cri.go:92] found id: ""
	I1219 06:15:08.040493 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.040500 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:08.040506 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:08.040566 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:08.067147 2064791 cri.go:92] found id: ""
	I1219 06:15:08.067162 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.067169 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:08.067175 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:08.067238 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:08.096399 2064791 cri.go:92] found id: ""
	I1219 06:15:08.096415 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.096422 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:08.096430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:08.096492 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:08.120924 2064791 cri.go:92] found id: ""
	I1219 06:15:08.120938 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.120945 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:08.120951 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:08.121010 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:08.145044 2064791 cri.go:92] found id: ""
	I1219 06:15:08.145057 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.145064 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:08.145072 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:08.145082 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:08.201643 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:08.201664 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:08.219150 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:08.219166 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:08.285100 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:08.276907   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.277509   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279021   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279543   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.281080   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:08.276907   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.277509   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279021   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279543   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.281080   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:08.285118 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:08.285129 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:08.349440 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:08.349460 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:10.878798 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:10.888854 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:10.888917 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:10.920436 2064791 cri.go:92] found id: ""
	I1219 06:15:10.920450 2064791 logs.go:282] 0 containers: []
	W1219 06:15:10.920457 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:10.920463 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:10.920536 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:10.951229 2064791 cri.go:92] found id: ""
	I1219 06:15:10.951243 2064791 logs.go:282] 0 containers: []
	W1219 06:15:10.951252 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:10.951258 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:10.951315 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:10.980039 2064791 cri.go:92] found id: ""
	I1219 06:15:10.980054 2064791 logs.go:282] 0 containers: []
	W1219 06:15:10.980061 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:10.980066 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:10.980126 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:11.008250 2064791 cri.go:92] found id: ""
	I1219 06:15:11.008265 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.008273 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:11.008278 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:11.008346 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:11.033554 2064791 cri.go:92] found id: ""
	I1219 06:15:11.033568 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.033575 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:11.033580 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:11.033641 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:11.058115 2064791 cri.go:92] found id: ""
	I1219 06:15:11.058128 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.058135 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:11.058141 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:11.058219 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:11.083222 2064791 cri.go:92] found id: ""
	I1219 06:15:11.083236 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.083242 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:11.083250 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:11.083260 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:11.146681 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:11.146702 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:11.176028 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:11.176047 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:11.233340 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:11.233361 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:11.250941 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:11.250957 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:11.315829 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:11.306797   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.307411   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309263   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309846   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.311388   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:11.306797   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.307411   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309263   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309846   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.311388   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:13.816114 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:13.826460 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:13.826527 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:13.850958 2064791 cri.go:92] found id: ""
	I1219 06:15:13.850973 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.850980 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:13.850988 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:13.851048 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:13.879518 2064791 cri.go:92] found id: ""
	I1219 06:15:13.879538 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.879546 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:13.879551 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:13.879611 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:13.917876 2064791 cri.go:92] found id: ""
	I1219 06:15:13.917890 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.917897 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:13.917902 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:13.917965 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:13.957039 2064791 cri.go:92] found id: ""
	I1219 06:15:13.957053 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.957060 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:13.957065 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:13.957126 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:13.992398 2064791 cri.go:92] found id: ""
	I1219 06:15:13.992412 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.992419 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:13.992424 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:13.992486 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:14.019915 2064791 cri.go:92] found id: ""
	I1219 06:15:14.019930 2064791 logs.go:282] 0 containers: []
	W1219 06:15:14.019938 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:14.019943 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:14.020004 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:14.045800 2064791 cri.go:92] found id: ""
	I1219 06:15:14.045815 2064791 logs.go:282] 0 containers: []
	W1219 06:15:14.045822 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:14.045830 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:14.045841 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:14.102453 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:14.102472 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:14.120093 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:14.120110 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:14.183187 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:14.175289   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.175797   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177338   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177777   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.179247   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:14.175289   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.175797   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177338   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177777   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.179247   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:14.183198 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:14.183209 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:14.246652 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:14.246673 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:16.780257 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:16.790741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:16.790802 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:16.815777 2064791 cri.go:92] found id: ""
	I1219 06:15:16.815802 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.815809 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:16.815815 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:16.815890 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:16.841105 2064791 cri.go:92] found id: ""
	I1219 06:15:16.841124 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.841142 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:16.841148 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:16.841217 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:16.866795 2064791 cri.go:92] found id: ""
	I1219 06:15:16.866820 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.866827 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:16.866833 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:16.866910 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:16.892692 2064791 cri.go:92] found id: ""
	I1219 06:15:16.892706 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.892713 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:16.892718 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:16.892803 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:16.926258 2064791 cri.go:92] found id: ""
	I1219 06:15:16.926272 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.926279 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:16.926285 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:16.926346 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:16.955968 2064791 cri.go:92] found id: ""
	I1219 06:15:16.955982 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.955989 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:16.955995 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:16.956057 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:16.985158 2064791 cri.go:92] found id: ""
	I1219 06:15:16.985172 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.985179 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:16.985186 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:16.985196 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:17.043879 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:17.043899 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:17.061599 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:17.061616 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:17.125509 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:17.117153   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.117733   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119480   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119896   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.121399   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:17.117153   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.117733   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119480   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119896   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.121399   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:17.125519 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:17.125531 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:17.189339 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:17.189359 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:19.721517 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:19.731846 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:19.731916 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:19.758133 2064791 cri.go:92] found id: ""
	I1219 06:15:19.758147 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.758154 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:19.758160 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:19.758228 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:19.787023 2064791 cri.go:92] found id: ""
	I1219 06:15:19.787037 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.787045 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:19.787059 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:19.787123 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:19.813855 2064791 cri.go:92] found id: ""
	I1219 06:15:19.813869 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.813876 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:19.813881 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:19.813944 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:19.838418 2064791 cri.go:92] found id: ""
	I1219 06:15:19.838432 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.838439 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:19.838444 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:19.838508 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:19.863215 2064791 cri.go:92] found id: ""
	I1219 06:15:19.863229 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.863240 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:19.863246 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:19.863307 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:19.887732 2064791 cri.go:92] found id: ""
	I1219 06:15:19.887746 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.887753 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:19.887758 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:19.887815 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:19.930174 2064791 cri.go:92] found id: ""
	I1219 06:15:19.930192 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.930200 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:19.930208 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:19.930222 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:19.949025 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:19.949041 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:20.022932 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:20.013526   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.014350   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016121   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016702   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.018424   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:20.013526   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.014350   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016121   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016702   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.018424   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:20.022944 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:20.022955 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:20.088903 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:20.088924 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:20.117778 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:20.117794 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:22.677536 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:22.687468 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:22.687536 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:22.712714 2064791 cri.go:92] found id: ""
	I1219 06:15:22.712728 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.712736 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:22.712741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:22.712816 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:22.736316 2064791 cri.go:92] found id: ""
	I1219 06:15:22.736329 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.736336 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:22.736341 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:22.736401 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:22.762215 2064791 cri.go:92] found id: ""
	I1219 06:15:22.762229 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.762236 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:22.762241 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:22.762309 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:22.787061 2064791 cri.go:92] found id: ""
	I1219 06:15:22.787074 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.787081 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:22.787086 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:22.787146 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:22.814937 2064791 cri.go:92] found id: ""
	I1219 06:15:22.814951 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.814957 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:22.814963 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:22.815033 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:22.842839 2064791 cri.go:92] found id: ""
	I1219 06:15:22.842853 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.842859 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:22.842865 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:22.842923 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:22.869394 2064791 cri.go:92] found id: ""
	I1219 06:15:22.869407 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.869413 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:22.869421 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:22.869430 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:22.926492 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:22.926510 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:22.944210 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:22.944232 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:23.013797 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:23.003493   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.005070   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.006087   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.007846   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.008447   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:23.003493   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.005070   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.006087   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.007846   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.008447   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:23.013807 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:23.013821 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:23.081279 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:23.081306 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:25.612946 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:25.622887 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:25.622947 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:25.656332 2064791 cri.go:92] found id: ""
	I1219 06:15:25.656346 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.656353 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:25.656359 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:25.656425 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:25.680887 2064791 cri.go:92] found id: ""
	I1219 06:15:25.680901 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.680908 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:25.680913 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:25.680981 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:25.705508 2064791 cri.go:92] found id: ""
	I1219 06:15:25.705523 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.705531 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:25.705536 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:25.705598 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:25.729434 2064791 cri.go:92] found id: ""
	I1219 06:15:25.729447 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.729454 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:25.729459 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:25.729517 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:25.755351 2064791 cri.go:92] found id: ""
	I1219 06:15:25.755365 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.755381 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:25.755388 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:25.755449 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:25.782840 2064791 cri.go:92] found id: ""
	I1219 06:15:25.782854 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.782861 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:25.782866 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:25.782929 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:25.811125 2064791 cri.go:92] found id: ""
	I1219 06:15:25.811139 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.811155 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:25.811165 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:25.811175 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:25.867579 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:25.867601 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:25.884977 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:25.884996 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:25.983099 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:25.974919   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.975374   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.977076   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.977555   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.979165   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:15:25.983110 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:25.983119 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:26.047515 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:26.047534 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:28.576468 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:28.586983 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:28.587044 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:28.612243 2064791 cri.go:92] found id: ""
	I1219 06:15:28.612257 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.612264 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:28.612270 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:28.612331 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:28.637476 2064791 cri.go:92] found id: ""
	I1219 06:15:28.637490 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.637496 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:28.637502 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:28.637564 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:28.662778 2064791 cri.go:92] found id: ""
	I1219 06:15:28.662792 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.662800 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:28.662805 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:28.662864 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:28.687078 2064791 cri.go:92] found id: ""
	I1219 06:15:28.687091 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.687098 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:28.687105 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:28.687166 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:28.712552 2064791 cri.go:92] found id: ""
	I1219 06:15:28.712566 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.712572 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:28.712577 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:28.712646 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:28.738798 2064791 cri.go:92] found id: ""
	I1219 06:15:28.738812 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.738819 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:28.738824 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:28.738881 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:28.767309 2064791 cri.go:92] found id: ""
	I1219 06:15:28.767324 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.767340 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:28.767349 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:28.767358 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:28.827489 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:28.827509 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:28.844978 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:28.844994 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:28.915425 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:28.906948   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.907778   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.909411   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.909881   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.911514   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:15:28.915435 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:28.915445 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:28.980721 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:28.980742 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:31.518692 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:31.528660 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:31.528719 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:31.551685 2064791 cri.go:92] found id: ""
	I1219 06:15:31.551699 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.551706 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:31.551711 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:31.551772 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:31.578616 2064791 cri.go:92] found id: ""
	I1219 06:15:31.578631 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.578637 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:31.578643 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:31.578703 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:31.602562 2064791 cri.go:92] found id: ""
	I1219 06:15:31.602576 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.602582 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:31.602588 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:31.602646 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:31.626697 2064791 cri.go:92] found id: ""
	I1219 06:15:31.626711 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.626718 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:31.626723 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:31.626786 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:31.650705 2064791 cri.go:92] found id: ""
	I1219 06:15:31.650718 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.650725 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:31.650730 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:31.650791 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:31.675292 2064791 cri.go:92] found id: ""
	I1219 06:15:31.675305 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.675312 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:31.675318 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:31.675390 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:31.699969 2064791 cri.go:92] found id: ""
	I1219 06:15:31.699993 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.700000 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:31.700008 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:31.700018 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:31.765728 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:31.765750 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:31.793450 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:31.793466 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:31.849244 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:31.849262 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:31.866467 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:31.866483 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:31.960156 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:31.933314   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.948921   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.949480   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.951697   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.951968   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:15:34.460923 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:34.473072 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:34.473134 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:34.498075 2064791 cri.go:92] found id: ""
	I1219 06:15:34.498089 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.498097 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:34.498103 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:34.498162 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:34.522785 2064791 cri.go:92] found id: ""
	I1219 06:15:34.522800 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.522807 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:34.522812 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:34.522871 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:34.550566 2064791 cri.go:92] found id: ""
	I1219 06:15:34.550580 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.550587 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:34.550592 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:34.550651 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:34.579586 2064791 cri.go:92] found id: ""
	I1219 06:15:34.579600 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.579607 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:34.579612 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:34.579670 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:34.606248 2064791 cri.go:92] found id: ""
	I1219 06:15:34.606261 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.606269 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:34.606274 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:34.606335 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:34.634419 2064791 cri.go:92] found id: ""
	I1219 06:15:34.634433 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.634440 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:34.634446 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:34.634509 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:34.658438 2064791 cri.go:92] found id: ""
	I1219 06:15:34.658451 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.658458 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:34.658465 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:34.658475 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:34.675933 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:34.675950 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:34.740273 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:34.732883   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.733297   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.734737   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.735051   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.736480   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:15:34.740283 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:34.740293 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:34.802357 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:34.802378 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:34.833735 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:34.833751 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:37.390170 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:37.400300 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:37.400358 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:37.425095 2064791 cri.go:92] found id: ""
	I1219 06:15:37.425110 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.425117 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:37.425122 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:37.425178 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:37.451178 2064791 cri.go:92] found id: ""
	I1219 06:15:37.451192 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.451199 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:37.451205 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:37.451273 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:37.475828 2064791 cri.go:92] found id: ""
	I1219 06:15:37.475842 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.475848 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:37.475854 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:37.475911 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:37.499474 2064791 cri.go:92] found id: ""
	I1219 06:15:37.499488 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.499494 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:37.499500 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:37.499563 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:37.523636 2064791 cri.go:92] found id: ""
	I1219 06:15:37.523649 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.523656 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:37.523662 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:37.523720 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:37.547846 2064791 cri.go:92] found id: ""
	I1219 06:15:37.547859 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.547868 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:37.547873 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:37.547929 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:37.574766 2064791 cri.go:92] found id: ""
	I1219 06:15:37.574780 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.574787 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:37.574795 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:37.574805 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:37.601905 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:37.601923 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:37.657564 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:37.657584 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:37.674777 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:37.674793 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:37.736918 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:37.728853   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.729594   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731160   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731519   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.733094   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:37.728853   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.729594   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731160   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731519   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.733094   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:37.736928 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:37.736939 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:40.303769 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:40.313854 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:40.313919 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:40.338505 2064791 cri.go:92] found id: ""
	I1219 06:15:40.338519 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.338527 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:40.338532 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:40.338594 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:40.363391 2064791 cri.go:92] found id: ""
	I1219 06:15:40.363405 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.363412 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:40.363417 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:40.363476 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:40.389092 2064791 cri.go:92] found id: ""
	I1219 06:15:40.389105 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.389113 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:40.389118 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:40.389184 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:40.412993 2064791 cri.go:92] found id: ""
	I1219 06:15:40.413007 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.413014 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:40.413022 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:40.413087 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:40.438530 2064791 cri.go:92] found id: ""
	I1219 06:15:40.438544 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.438550 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:40.438556 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:40.438617 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:40.462221 2064791 cri.go:92] found id: ""
	I1219 06:15:40.462235 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.462242 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:40.462248 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:40.462310 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:40.487125 2064791 cri.go:92] found id: ""
	I1219 06:15:40.487139 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.487146 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:40.487155 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:40.487165 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:40.543163 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:40.543184 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:40.560362 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:40.560379 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:40.627130 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:40.619309   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.619937   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621423   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621900   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.623348   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:40.619309   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.619937   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621423   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621900   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.623348   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:40.627139 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:40.627149 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:40.689654 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:40.689673 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:43.219338 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:43.229544 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:43.229607 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:43.253914 2064791 cri.go:92] found id: ""
	I1219 06:15:43.253935 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.253941 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:43.253947 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:43.254007 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:43.279019 2064791 cri.go:92] found id: ""
	I1219 06:15:43.279033 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.279040 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:43.279045 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:43.279106 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:43.304187 2064791 cri.go:92] found id: ""
	I1219 06:15:43.304202 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.304209 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:43.304216 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:43.304275 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:43.327938 2064791 cri.go:92] found id: ""
	I1219 06:15:43.327951 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.327958 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:43.327963 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:43.328027 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:43.356864 2064791 cri.go:92] found id: ""
	I1219 06:15:43.356878 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.356885 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:43.356891 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:43.356958 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:43.381050 2064791 cri.go:92] found id: ""
	I1219 06:15:43.381063 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.381070 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:43.381076 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:43.381138 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:43.404804 2064791 cri.go:92] found id: ""
	I1219 06:15:43.404818 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.404825 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:43.404832 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:43.404857 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:43.470026 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:43.461361   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.461922   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.463417   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.464514   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.465204   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:43.461361   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.461922   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.463417   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.464514   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.465204   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:43.470036 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:43.470050 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:43.533067 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:43.533086 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:43.560074 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:43.560097 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:43.618564 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:43.618582 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:46.135866 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:46.146429 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:46.146493 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:46.180564 2064791 cri.go:92] found id: ""
	I1219 06:15:46.180578 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.180595 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:46.180601 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:46.180669 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:46.208067 2064791 cri.go:92] found id: ""
	I1219 06:15:46.208081 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.208087 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:46.208100 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:46.208159 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:46.234676 2064791 cri.go:92] found id: ""
	I1219 06:15:46.234692 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.234703 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:46.234709 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:46.234775 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:46.259673 2064791 cri.go:92] found id: ""
	I1219 06:15:46.259686 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.259693 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:46.259707 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:46.259765 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:46.286964 2064791 cri.go:92] found id: ""
	I1219 06:15:46.286979 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.286986 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:46.286992 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:46.287056 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:46.312785 2064791 cri.go:92] found id: ""
	I1219 06:15:46.312800 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.312807 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:46.312813 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:46.312875 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:46.339250 2064791 cri.go:92] found id: ""
	I1219 06:15:46.339264 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.339271 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:46.339279 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:46.339290 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:46.368113 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:46.368129 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:46.423008 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:46.423029 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:46.440481 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:46.440503 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:46.504270 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:46.496670   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.497191   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.498736   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.499181   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.500593   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:46.496670   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.497191   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.498736   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.499181   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.500593   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:46.504280 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:46.504291 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:49.065736 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:49.075993 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:49.076057 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:49.102714 2064791 cri.go:92] found id: ""
	I1219 06:15:49.102729 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.102736 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:49.102741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:49.102808 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:49.131284 2064791 cri.go:92] found id: ""
	I1219 06:15:49.131297 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.131323 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:49.131328 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:49.131398 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:49.166942 2064791 cri.go:92] found id: ""
	I1219 06:15:49.166955 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.166962 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:49.166968 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:49.167036 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:49.204412 2064791 cri.go:92] found id: ""
	I1219 06:15:49.204425 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.204444 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:49.204450 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:49.204522 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:49.232351 2064791 cri.go:92] found id: ""
	I1219 06:15:49.232364 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.232371 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:49.232377 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:49.232434 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:49.257013 2064791 cri.go:92] found id: ""
	I1219 06:15:49.257028 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.257046 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:49.257052 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:49.257112 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:49.282354 2064791 cri.go:92] found id: ""
	I1219 06:15:49.282368 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.282375 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:49.282384 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:49.282396 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:49.351742 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:49.342325   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.343272   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345051   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345596   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.347231   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:49.342325   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.343272   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345051   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345596   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.347231   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:49.351753 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:49.351764 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:49.416971 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:49.416991 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:49.445804 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:49.445819 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:49.503988 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:49.504006 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:52.023309 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:52.034750 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:52.034819 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:52.061000 2064791 cri.go:92] found id: ""
	I1219 06:15:52.061014 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.061021 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:52.061026 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:52.061084 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:52.086949 2064791 cri.go:92] found id: ""
	I1219 06:15:52.086964 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.086971 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:52.086977 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:52.087048 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:52.112534 2064791 cri.go:92] found id: ""
	I1219 06:15:52.112549 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.112556 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:52.112562 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:52.112635 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:52.137132 2064791 cri.go:92] found id: ""
	I1219 06:15:52.137146 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.137154 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:52.137160 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:52.137221 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:52.191157 2064791 cri.go:92] found id: ""
	I1219 06:15:52.191171 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.191178 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:52.191184 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:52.191245 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:52.220921 2064791 cri.go:92] found id: ""
	I1219 06:15:52.220936 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.220942 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:52.220948 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:52.221009 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:52.250645 2064791 cri.go:92] found id: ""
	I1219 06:15:52.250658 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.250665 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:52.250673 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:52.250684 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:52.306199 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:52.306222 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:52.323553 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:52.323570 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:52.386634 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:52.378311   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.379023   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.380829   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.381330   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.382864   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:52.378311   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.379023   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.380829   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.381330   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.382864   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:52.386643 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:52.386653 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:52.450135 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:52.450155 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:54.981347 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:54.991806 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:54.991864 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:55.028687 2064791 cri.go:92] found id: ""
	I1219 06:15:55.028702 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.028709 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:55.028714 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:55.028797 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:55.053716 2064791 cri.go:92] found id: ""
	I1219 06:15:55.053730 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.053737 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:55.053784 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:55.053857 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:55.080935 2064791 cri.go:92] found id: ""
	I1219 06:15:55.080949 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.080957 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:55.080962 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:55.081027 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:55.109910 2064791 cri.go:92] found id: ""
	I1219 06:15:55.109925 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.109932 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:55.109938 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:55.110005 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:55.138372 2064791 cri.go:92] found id: ""
	I1219 06:15:55.138386 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.138393 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:55.138400 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:55.138463 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:55.172107 2064791 cri.go:92] found id: ""
	I1219 06:15:55.172121 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.172128 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:55.172133 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:55.172191 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:55.207670 2064791 cri.go:92] found id: ""
	I1219 06:15:55.207684 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.207690 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:55.207698 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:55.207708 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:55.273955 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:55.273975 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:55.303942 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:55.303960 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:55.367492 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:55.367517 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:55.384909 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:55.384933 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:55.447954 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:55.439722   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.440407   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442003   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442525   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.444025   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:55.439722   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.440407   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442003   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442525   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.444025   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:57.948746 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:57.959024 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:57.959084 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:57.984251 2064791 cri.go:92] found id: ""
	I1219 06:15:57.984264 2064791 logs.go:282] 0 containers: []
	W1219 06:15:57.984271 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:57.984277 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:57.984335 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:58.012444 2064791 cri.go:92] found id: ""
	I1219 06:15:58.012459 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.012467 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:58.012472 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:58.012531 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:58.040674 2064791 cri.go:92] found id: ""
	I1219 06:15:58.040688 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.040695 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:58.040700 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:58.040783 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:58.066507 2064791 cri.go:92] found id: ""
	I1219 06:15:58.066522 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.066529 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:58.066535 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:58.066598 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:58.095594 2064791 cri.go:92] found id: ""
	I1219 06:15:58.095608 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.095615 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:58.095620 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:58.095680 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:58.121624 2064791 cri.go:92] found id: ""
	I1219 06:15:58.121638 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.121644 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:58.121650 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:58.121707 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:58.149741 2064791 cri.go:92] found id: ""
	I1219 06:15:58.149755 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.149762 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:58.149770 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:58.149782 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:58.181272 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:58.181288 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:58.240957 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:58.240987 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:58.258044 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:58.258060 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:58.322228 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:58.314484   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.315163   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.316700   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.317166   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.318619   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:58.314484   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.315163   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.316700   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.317166   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.318619   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:58.322239 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:58.322250 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:00.885057 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:00.895320 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:00.895386 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:00.919880 2064791 cri.go:92] found id: ""
	I1219 06:16:00.919914 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.919922 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:00.919927 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:00.919995 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:00.944225 2064791 cri.go:92] found id: ""
	I1219 06:16:00.944238 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.944245 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:00.944250 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:00.944316 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:00.969895 2064791 cri.go:92] found id: ""
	I1219 06:16:00.969909 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.969916 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:00.969921 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:00.969982 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:00.994103 2064791 cri.go:92] found id: ""
	I1219 06:16:00.994118 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.994134 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:00.994141 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:00.994224 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:01.021151 2064791 cri.go:92] found id: ""
	I1219 06:16:01.021166 2064791 logs.go:282] 0 containers: []
	W1219 06:16:01.021172 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:01.021181 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:01.021244 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:01.046747 2064791 cri.go:92] found id: ""
	I1219 06:16:01.046761 2064791 logs.go:282] 0 containers: []
	W1219 06:16:01.046768 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:01.046773 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:01.046831 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:01.071655 2064791 cri.go:92] found id: ""
	I1219 06:16:01.071672 2064791 logs.go:282] 0 containers: []
	W1219 06:16:01.071679 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:01.071686 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:01.071696 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:01.127618 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:01.127636 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:01.145631 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:01.145650 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:01.235681 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:01.226719   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.227442   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229160   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229807   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.231513   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:01.226719   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.227442   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229160   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229807   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.231513   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:01.235691 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:01.235703 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:01.299234 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:01.299254 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:03.829050 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:03.839364 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:03.839436 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:03.871023 2064791 cri.go:92] found id: ""
	I1219 06:16:03.871037 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.871044 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:03.871049 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:03.871107 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:03.895774 2064791 cri.go:92] found id: ""
	I1219 06:16:03.895788 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.895795 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:03.895800 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:03.895859 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:03.921890 2064791 cri.go:92] found id: ""
	I1219 06:16:03.921904 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.921911 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:03.921916 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:03.921978 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:03.946705 2064791 cri.go:92] found id: ""
	I1219 06:16:03.946719 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.946726 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:03.946731 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:03.946790 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:03.972566 2064791 cri.go:92] found id: ""
	I1219 06:16:03.972579 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.972605 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:03.972610 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:03.972676 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:03.998217 2064791 cri.go:92] found id: ""
	I1219 06:16:03.998232 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.998239 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:03.998245 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:03.998311 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:04.024748 2064791 cri.go:92] found id: ""
	I1219 06:16:04.024786 2064791 logs.go:282] 0 containers: []
	W1219 06:16:04.024793 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:04.024802 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:04.024827 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:04.089385 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:04.089406 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:04.120677 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:04.120695 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:04.178263 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:04.178282 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:04.201672 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:04.201688 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:04.272543 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:04.263798   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.264930   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.265587   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267210   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267480   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:16:06.772819 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:06.784042 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:06.784119 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:06.809087 2064791 cri.go:92] found id: ""
	I1219 06:16:06.809101 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.809108 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:06.809113 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:06.809171 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:06.833636 2064791 cri.go:92] found id: ""
	I1219 06:16:06.833649 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.833656 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:06.833661 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:06.833726 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:06.862766 2064791 cri.go:92] found id: ""
	I1219 06:16:06.862781 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.862788 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:06.862797 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:06.862858 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:06.887915 2064791 cri.go:92] found id: ""
	I1219 06:16:06.887929 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.887935 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:06.887940 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:06.888001 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:06.913093 2064791 cri.go:92] found id: ""
	I1219 06:16:06.913107 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.913114 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:06.913119 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:06.913184 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:06.944662 2064791 cri.go:92] found id: ""
	I1219 06:16:06.944677 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.944695 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:06.944700 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:06.944796 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:06.976908 2064791 cri.go:92] found id: ""
	I1219 06:16:06.976923 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.976929 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:06.976937 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:06.976948 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:07.041844 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:07.041865 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:07.071749 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:07.071765 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:07.130039 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:07.130060 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:07.147749 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:07.147766 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:07.226540 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:07.218267   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.218857   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220373   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220937   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.222460   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:16:09.726802 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:09.737347 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:09.737408 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:09.761740 2064791 cri.go:92] found id: ""
	I1219 06:16:09.761754 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.761761 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:09.761767 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:09.761838 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:09.787861 2064791 cri.go:92] found id: ""
	I1219 06:16:09.787876 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.787883 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:09.787888 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:09.787950 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:09.812599 2064791 cri.go:92] found id: ""
	I1219 06:16:09.812613 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.812620 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:09.812625 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:09.812687 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:09.837573 2064791 cri.go:92] found id: ""
	I1219 06:16:09.837588 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.837596 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:09.837601 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:09.837661 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:09.861697 2064791 cri.go:92] found id: ""
	I1219 06:16:09.861712 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.861718 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:09.861723 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:09.861788 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:09.886842 2064791 cri.go:92] found id: ""
	I1219 06:16:09.886856 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.886872 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:09.886884 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:09.886956 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:09.912372 2064791 cri.go:92] found id: ""
	I1219 06:16:09.912387 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.912395 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:09.912403 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:09.912413 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:09.971481 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:09.971501 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:09.989303 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:09.989320 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:10.067493 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:10.058017   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059169   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059962   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.061845   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.062203   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:16:10.067504 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:10.067517 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:10.132042 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:10.132062 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:12.664804 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:12.675466 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:12.675550 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:12.704963 2064791 cri.go:92] found id: ""
	I1219 06:16:12.704978 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.704985 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:12.704990 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:12.705052 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:12.730087 2064791 cri.go:92] found id: ""
	I1219 06:16:12.730103 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.730110 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:12.730115 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:12.730178 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:12.760566 2064791 cri.go:92] found id: ""
	I1219 06:16:12.760595 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.760602 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:12.760608 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:12.760675 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:12.785694 2064791 cri.go:92] found id: ""
	I1219 06:16:12.785707 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.785714 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:12.785719 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:12.785781 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:12.811923 2064791 cri.go:92] found id: ""
	I1219 06:16:12.811938 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.811956 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:12.811962 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:12.812036 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:12.838424 2064791 cri.go:92] found id: ""
	I1219 06:16:12.838438 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.838445 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:12.838451 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:12.838514 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:12.864177 2064791 cri.go:92] found id: ""
	I1219 06:16:12.864191 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.864198 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:12.864206 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:12.864216 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:12.920882 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:12.920904 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:12.937942 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:12.937959 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:13.004209 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:12.994302   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.994966   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.996691   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.997250   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.998931   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:16:13.004223 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:13.004247 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:13.067051 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:13.067071 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:15.596451 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:15.606953 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:15.607013 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:15.639546 2064791 cri.go:92] found id: ""
	I1219 06:16:15.639560 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.639569 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:15.639574 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:15.639637 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:15.667230 2064791 cri.go:92] found id: ""
	I1219 06:16:15.667245 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.667252 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:15.667257 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:15.667321 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:15.693059 2064791 cri.go:92] found id: ""
	I1219 06:16:15.693073 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.693080 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:15.693086 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:15.693145 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:15.718341 2064791 cri.go:92] found id: ""
	I1219 06:16:15.718356 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.718363 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:15.718368 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:15.718437 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:15.744544 2064791 cri.go:92] found id: ""
	I1219 06:16:15.744559 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.744566 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:15.744571 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:15.744632 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:15.769809 2064791 cri.go:92] found id: ""
	I1219 06:16:15.769823 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.769830 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:15.769836 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:15.769897 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:15.793872 2064791 cri.go:92] found id: ""
	I1219 06:16:15.793887 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.793894 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:15.793902 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:15.793914 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:15.811209 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:15.811228 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:15.875495 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:15.867475   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.868031   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.869581   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.870044   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.871521   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:16:15.875504 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:15.875516 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:15.938869 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:15.938889 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:15.967183 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:15.967200 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:18.524056 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:18.534213 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:18.534283 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:18.558904 2064791 cri.go:92] found id: ""
	I1219 06:16:18.558918 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.558924 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:18.558929 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:18.558994 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:18.583638 2064791 cri.go:92] found id: ""
	I1219 06:16:18.583653 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.583661 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:18.583666 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:18.583726 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:18.611047 2064791 cri.go:92] found id: ""
	I1219 06:16:18.611061 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.611068 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:18.611073 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:18.611133 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:18.635234 2064791 cri.go:92] found id: ""
	I1219 06:16:18.635248 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.635255 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:18.635261 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:18.635322 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:18.658732 2064791 cri.go:92] found id: ""
	I1219 06:16:18.658747 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.658754 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:18.658759 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:18.658819 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:18.687782 2064791 cri.go:92] found id: ""
	I1219 06:16:18.687796 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.687803 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:18.687808 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:18.687871 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:18.713641 2064791 cri.go:92] found id: ""
	I1219 06:16:18.713655 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.713662 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:18.713670 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:18.713687 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:18.730768 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:18.730786 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:18.797385 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:18.788999   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.789629   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791299   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791871   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.793492   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:18.788999   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.789629   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791299   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791871   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.793492   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:18.797396 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:18.797406 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:18.861009 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:18.861029 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:18.889085 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:18.889102 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:21.448880 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:21.458996 2064791 kubeadm.go:602] duration metric: took 4m4.592886052s to restartPrimaryControlPlane
	W1219 06:16:21.459078 2064791 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1219 06:16:21.459152 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1219 06:16:21.873036 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:16:21.887075 2064791 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1219 06:16:21.894868 2064791 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1219 06:16:21.894925 2064791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:16:21.902909 2064791 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1219 06:16:21.902919 2064791 kubeadm.go:158] found existing configuration files:
	
	I1219 06:16:21.902973 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 06:16:21.912282 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1219 06:16:21.912342 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1219 06:16:21.920310 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 06:16:21.928090 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1219 06:16:21.928158 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:16:21.935829 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 06:16:21.944085 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1219 06:16:21.944143 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:16:21.951866 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 06:16:21.959883 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1219 06:16:21.959950 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:16:21.967628 2064791 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1219 06:16:22.006002 2064791 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1219 06:16:22.006076 2064791 kubeadm.go:319] [preflight] Running pre-flight checks
	I1219 06:16:22.084826 2064791 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1219 06:16:22.084890 2064791 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1219 06:16:22.084925 2064791 kubeadm.go:319] OS: Linux
	I1219 06:16:22.084969 2064791 kubeadm.go:319] CGROUPS_CPU: enabled
	I1219 06:16:22.085017 2064791 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1219 06:16:22.085068 2064791 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1219 06:16:22.085115 2064791 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1219 06:16:22.085163 2064791 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1219 06:16:22.085209 2064791 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1219 06:16:22.085254 2064791 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1219 06:16:22.085302 2064791 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1219 06:16:22.085348 2064791 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1219 06:16:22.154531 2064791 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1219 06:16:22.154670 2064791 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1219 06:16:22.154781 2064791 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1219 06:16:22.163477 2064791 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1219 06:16:22.169007 2064791 out.go:252]   - Generating certificates and keys ...
	I1219 06:16:22.169099 2064791 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1219 06:16:22.169162 2064791 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1219 06:16:22.169237 2064791 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1219 06:16:22.169297 2064791 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1219 06:16:22.169372 2064791 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1219 06:16:22.169426 2064791 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1219 06:16:22.169488 2064791 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1219 06:16:22.169549 2064791 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1219 06:16:22.169633 2064791 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1219 06:16:22.169704 2064791 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1219 06:16:22.169741 2064791 kubeadm.go:319] [certs] Using the existing "sa" key
	I1219 06:16:22.169795 2064791 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1219 06:16:22.320644 2064791 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1219 06:16:22.743805 2064791 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1219 06:16:22.867878 2064791 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1219 06:16:22.974729 2064791 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1219 06:16:23.395365 2064791 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1219 06:16:23.396030 2064791 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1219 06:16:23.399355 2064791 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1219 06:16:23.402464 2064791 out.go:252]   - Booting up control plane ...
	I1219 06:16:23.402561 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1219 06:16:23.402637 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1219 06:16:23.403521 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1219 06:16:23.423590 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1219 06:16:23.423990 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1219 06:16:23.431661 2064791 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1219 06:16:23.431897 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1219 06:16:23.432074 2064791 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1219 06:16:23.567443 2064791 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1219 06:16:23.567557 2064791 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1219 06:20:23.567966 2064791 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000561406s
	I1219 06:20:23.567991 2064791 kubeadm.go:319] 
	I1219 06:20:23.568084 2064791 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1219 06:20:23.568128 2064791 kubeadm.go:319] 	- The kubelet is not running
	I1219 06:20:23.568239 2064791 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1219 06:20:23.568244 2064791 kubeadm.go:319] 
	I1219 06:20:23.568354 2064791 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1219 06:20:23.568390 2064791 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1219 06:20:23.568420 2064791 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1219 06:20:23.568423 2064791 kubeadm.go:319] 
	I1219 06:20:23.572732 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 06:20:23.573205 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1219 06:20:23.573348 2064791 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 06:20:23.573651 2064791 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1219 06:20:23.573656 2064791 kubeadm.go:319] 
	W1219 06:20:23.573846 2064791 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000561406s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1219 06:20:23.573948 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1219 06:20:23.574218 2064791 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1219 06:20:23.984042 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:20:23.997740 2064791 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1219 06:20:23.997798 2064791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:20:24.008638 2064791 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1219 06:20:24.008649 2064791 kubeadm.go:158] found existing configuration files:
	
	I1219 06:20:24.008724 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 06:20:24.018051 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1219 06:20:24.018112 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1219 06:20:24.026089 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 06:20:24.034468 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1219 06:20:24.034524 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:20:24.042330 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 06:20:24.050325 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1219 06:20:24.050390 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:20:24.058263 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 06:20:24.066872 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1219 06:20:24.066933 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:20:24.075206 2064791 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1219 06:20:24.113532 2064791 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1219 06:20:24.113595 2064791 kubeadm.go:319] [preflight] Running pre-flight checks
	I1219 06:20:24.190273 2064791 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1219 06:20:24.190347 2064791 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1219 06:20:24.190399 2064791 kubeadm.go:319] OS: Linux
	I1219 06:20:24.190447 2064791 kubeadm.go:319] CGROUPS_CPU: enabled
	I1219 06:20:24.190497 2064791 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1219 06:20:24.190547 2064791 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1219 06:20:24.190597 2064791 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1219 06:20:24.190648 2064791 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1219 06:20:24.190697 2064791 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1219 06:20:24.190745 2064791 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1219 06:20:24.190796 2064791 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1219 06:20:24.190844 2064791 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1219 06:20:24.261095 2064791 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1219 06:20:24.261198 2064791 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1219 06:20:24.261287 2064791 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1219 06:20:24.273343 2064791 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1219 06:20:24.278556 2064791 out.go:252]   - Generating certificates and keys ...
	I1219 06:20:24.278645 2064791 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1219 06:20:24.278707 2064791 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1219 06:20:24.278781 2064791 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1219 06:20:24.278840 2064791 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1219 06:20:24.278908 2064791 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1219 06:20:24.278961 2064791 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1219 06:20:24.279023 2064791 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1219 06:20:24.279082 2064791 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1219 06:20:24.279155 2064791 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1219 06:20:24.279227 2064791 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1219 06:20:24.279263 2064791 kubeadm.go:319] [certs] Using the existing "sa" key
	I1219 06:20:24.279319 2064791 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1219 06:20:24.586742 2064791 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1219 06:20:24.705000 2064791 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1219 06:20:25.117117 2064791 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1219 06:20:25.207046 2064791 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1219 06:20:25.407003 2064791 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1219 06:20:25.408181 2064791 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1219 06:20:25.412332 2064791 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1219 06:20:25.415422 2064791 out.go:252]   - Booting up control plane ...
	I1219 06:20:25.415519 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1219 06:20:25.415596 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1219 06:20:25.415664 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1219 06:20:25.435196 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1219 06:20:25.435555 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1219 06:20:25.442782 2064791 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1219 06:20:25.443056 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1219 06:20:25.443098 2064791 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1219 06:20:25.586740 2064791 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1219 06:20:25.586852 2064791 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1219 06:24:25.586924 2064791 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000209622s
	I1219 06:24:25.586949 2064791 kubeadm.go:319] 
	I1219 06:24:25.587005 2064791 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1219 06:24:25.587037 2064791 kubeadm.go:319] 	- The kubelet is not running
	I1219 06:24:25.587152 2064791 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1219 06:24:25.587157 2064791 kubeadm.go:319] 
	I1219 06:24:25.587305 2064791 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1219 06:24:25.587351 2064791 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1219 06:24:25.587399 2064791 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1219 06:24:25.587405 2064791 kubeadm.go:319] 
	I1219 06:24:25.592745 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 06:24:25.593206 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1219 06:24:25.593358 2064791 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 06:24:25.593654 2064791 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1219 06:24:25.593660 2064791 kubeadm.go:319] 
	I1219 06:24:25.593751 2064791 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1219 06:24:25.593818 2064791 kubeadm.go:403] duration metric: took 12m8.761907578s to StartCluster
	I1219 06:24:25.593849 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:24:25.593915 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:24:25.619076 2064791 cri.go:92] found id: ""
	I1219 06:24:25.619090 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.619097 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:24:25.619103 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:24:25.619166 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:24:25.645501 2064791 cri.go:92] found id: ""
	I1219 06:24:25.645515 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.645522 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:24:25.645527 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:24:25.645587 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:24:25.671211 2064791 cri.go:92] found id: ""
	I1219 06:24:25.671225 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.671232 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:24:25.671237 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:24:25.671297 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:24:25.695076 2064791 cri.go:92] found id: ""
	I1219 06:24:25.695090 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.695098 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:24:25.695104 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:24:25.695165 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:24:25.720717 2064791 cri.go:92] found id: ""
	I1219 06:24:25.720733 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.720740 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:24:25.720745 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:24:25.720832 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:24:25.746445 2064791 cri.go:92] found id: ""
	I1219 06:24:25.746460 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.746466 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:24:25.746478 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:24:25.746541 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:24:25.771217 2064791 cri.go:92] found id: ""
	I1219 06:24:25.771231 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.771238 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:24:25.771249 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:24:25.771259 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:24:25.827848 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:24:25.827867 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:24:25.845454 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:24:25.845470 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:24:25.916464 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:24:25.906852   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.907635   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909247   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909952   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.911845   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:24:25.906852   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.907635   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909247   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909952   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.911845   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:24:25.916485 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:24:25.916495 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:24:25.988149 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:24:25.988168 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1219 06:24:26.019538 2064791 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000209622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	W1219 06:24:26.019579 2064791 out.go:285] * 
	W1219 06:24:26.019696 2064791 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000209622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1219 06:24:26.019769 2064791 out.go:285] * 
	W1219 06:24:26.022296 2064791 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1219 06:24:26.028311 2064791 out.go:203] 
	W1219 06:24:26.031204 2064791 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000209622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1219 06:24:26.031251 2064791 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1219 06:24:26.031270 2064791 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1219 06:24:26.034280 2064791 out.go:203] 
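
	The suggestion lines above name the troubleshooting steps (systemctl/journalctl on the kubelet, then retrying with the systemd cgroup driver). A minimal runbook sketch of those steps, assuming the failing profile is `functional-006924` as the containerd log below indicates — commands are taken from the log's own advice, not verified against this host:

	```shell
	# Inspect kubelet state inside the minikube node (per the log's advice)
	minikube ssh -p functional-006924 -- sudo systemctl status kubelet
	minikube ssh -p functional-006924 -- sudo journalctl -xeu kubelet --no-pager | tail -n 100

	# Probe the healthz endpoint kubeadm was polling for 4m0s
	minikube ssh -p functional-006924 -- curl -sS http://127.0.0.1:10248/healthz

	# Retry start with the cgroup driver the K8S_KUBELET_NOT_RUNNING hint suggests
	minikube start -p functional-006924 --extra-config=kubelet.cgroup-driver=systemd
	```

	Note the cgroups v1 warning in the preflight output: for kubelet v1.35+ on a cgroups v1 host, the kubelet config option `FailCgroupV1` must also be set to `false`, so the `--extra-config` retry alone may not be sufficient on this 5.15 AWS kernel.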
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483798627Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483867223Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483965234Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484038564Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484104559Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484166960Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484226562Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484289119Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484361021Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484452469Z" level=info msg="Connect containerd service"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484896289Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.485577404Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.498876654Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.499249089Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.498953709Z" level=info msg="Start subscribing containerd event"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.499359457Z" level=info msg="Start recovering state"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541527820Z" level=info msg="Start event monitor"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541744389Z" level=info msg="Start cni network conf syncer for default"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541814527Z" level=info msg="Start streaming server"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541876723Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541934873Z" level=info msg="runtime interface starting up..."
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541989897Z" level=info msg="starting plugins..."
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.542066690Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 19 06:12:15 functional-006924 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.544548112Z" level=info msg="containerd successfully booted in 0.093860s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:26:28.283823   23089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:28.284344   23089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:28.286077   23089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:28.286532   23089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:28.288066   23089 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 06:26:28 up 11:08,  0 user,  load average: 0.63, 0.28, 0.44
	Linux functional-006924 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 19 06:26:25 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:25 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 480.
	Dec 19 06:26:25 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:25 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:25 functional-006924 kubelet[22920]: E1219 06:26:25.951244   22920 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:25 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:25 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:26 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 481.
	Dec 19 06:26:26 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:26 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:26 functional-006924 kubelet[22962]: E1219 06:26:26.707681   22962 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:26 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:26 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:27 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 482.
	Dec 19 06:26:27 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:27 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:27 functional-006924 kubelet[22998]: E1219 06:26:27.388927   22998 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:27 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:27 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:28 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 483.
	Dec 19 06:26:28 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:28 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:28 functional-006924 kubelet[23074]: E1219 06:26:28.207180   23074 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:28 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:28 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924: exit status 2 (379.070896ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-006924" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/StatusCmd (3.11s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect (2.41s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect
functional_test.go:1636: (dbg) Run:  kubectl --context functional-006924 create deployment hello-node-connect --image kicbase/echo-server
functional_test.go:1636: (dbg) Non-zero exit: kubectl --context functional-006924 create deployment hello-node-connect --image kicbase/echo-server: exit status 1 (55.949721ms)

** stderr **
	error: failed to create deployment: Post "https://192.168.49.2:8441/apis/apps/v1/namespaces/default/deployments?fieldManager=kubectl-create&fieldValidation=Strict": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test.go:1638: failed to create hello-node deployment with this command "kubectl --context functional-006924 create deployment hello-node-connect --image kicbase/echo-server": exit status 1.
functional_test.go:1608: service test failed - dumping debug information
functional_test.go:1609: -----------------------service failure post-mortem--------------------------------
functional_test.go:1612: (dbg) Run:  kubectl --context functional-006924 describe po hello-node-connect
functional_test.go:1612: (dbg) Non-zero exit: kubectl --context functional-006924 describe po hello-node-connect: exit status 1 (57.116265ms)

** stderr **
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:1614: "kubectl --context functional-006924 describe po hello-node-connect" failed: exit status 1
functional_test.go:1616: hello-node pod describe:
functional_test.go:1618: (dbg) Run:  kubectl --context functional-006924 logs -l app=hello-node-connect
functional_test.go:1618: (dbg) Non-zero exit: kubectl --context functional-006924 logs -l app=hello-node-connect: exit status 1 (61.433054ms)

** stderr **
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:1620: "kubectl --context functional-006924 logs -l app=hello-node-connect" failed: exit status 1
functional_test.go:1622: hello-node logs:
functional_test.go:1624: (dbg) Run:  kubectl --context functional-006924 describe svc hello-node-connect
functional_test.go:1624: (dbg) Non-zero exit: kubectl --context functional-006924 describe svc hello-node-connect: exit status 1 (63.835425ms)

** stderr **
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:1626: "kubectl --context functional-006924 describe svc hello-node-connect" failed: exit status 1
functional_test.go:1628: hello-node svc describe:
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-006924
helpers_test.go:244: (dbg) docker inspect functional-006924:

-- stdout --
	[
	    {
	        "Id": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	        "Created": "2025-12-19T05:57:32.987616309Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2053574,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T05:57:33.050252475Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hostname",
	        "HostsPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hosts",
	        "LogPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6-json.log",
	        "Name": "/functional-006924",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-006924:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-006924",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	                "LowerDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/merged",
	                "UpperDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/diff",
	                "WorkDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-006924",
	                "Source": "/var/lib/docker/volumes/functional-006924/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-006924",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-006924",
	                "name.minikube.sigs.k8s.io": "functional-006924",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "c06ab2bd44169716d410789ed39ed6e7c04e20cbf7fddb96691439282b9c97ca",
	            "SandboxKey": "/var/run/docker/netns/c06ab2bd4416",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34704"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34705"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34708"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34706"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34707"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-006924": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "42:2f:87:6a:a8:7b",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "f63e8dc2cff83663f8a4d14108f192e61e457410fa4fc720cd9630dbf354815d",
	                    "EndpointID": "aa2b1cbd90d5c1f6130481423d97f82d974d4197e41ad0dbe3b7e51b22c8b4cc",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-006924",
	                        "651d0d6ef1db"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924: exit status 2 (305.696377ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                             ARGS                                                                             │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ cache   │ functional-006924 cache reload                                                                                                                               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ ssh     │ functional-006924 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                             │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                          │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │ 19 Dec 25 06:12 UTC │
	│ kubectl │ functional-006924 kubectl -- --context functional-006924 get pods                                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │                     │
	│ start   │ -p functional-006924 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                                     │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:12 UTC │                     │
	│ config  │ functional-006924 config unset cpus                                                                                                                          │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │ 19 Dec 25 06:24 UTC │
	│ cp      │ functional-006924 cp testdata/cp-test.txt /home/docker/cp-test.txt                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │ 19 Dec 25 06:24 UTC │
	│ config  │ functional-006924 config get cpus                                                                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │                     │
	│ config  │ functional-006924 config set cpus 2                                                                                                                          │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │ 19 Dec 25 06:24 UTC │
	│ config  │ functional-006924 config get cpus                                                                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │ 19 Dec 25 06:24 UTC │
	│ config  │ functional-006924 config unset cpus                                                                                                                          │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │ 19 Dec 25 06:24 UTC │
	│ ssh     │ functional-006924 ssh -n functional-006924 sudo cat /home/docker/cp-test.txt                                                                                 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │ 19 Dec 25 06:24 UTC │
	│ config  │ functional-006924 config get cpus                                                                                                                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │                     │
	│ ssh     │ functional-006924 ssh echo hello                                                                                                                             │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │ 19 Dec 25 06:24 UTC │
	│ cp      │ functional-006924 cp functional-006924:/home/docker/cp-test.txt /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelCpCm1537732802/001/cp-test.txt │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │ 19 Dec 25 06:24 UTC │
	│ ssh     │ functional-006924 ssh cat /etc/hostname                                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │ 19 Dec 25 06:24 UTC │
	│ ssh     │ functional-006924 ssh -n functional-006924 sudo cat /home/docker/cp-test.txt                                                                                 │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │ 19 Dec 25 06:24 UTC │
	│ tunnel  │ functional-006924 tunnel --alsologtostderr                                                                                                                   │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │                     │
	│ tunnel  │ functional-006924 tunnel --alsologtostderr                                                                                                                   │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │                     │
	│ cp      │ functional-006924 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt                                                                                    │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │ 19 Dec 25 06:24 UTC │
	│ tunnel  │ functional-006924 tunnel --alsologtostderr                                                                                                                   │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │                     │
	│ ssh     │ functional-006924 ssh -n functional-006924 sudo cat /tmp/does/not/exist/cp-test.txt                                                                          │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:24 UTC │ 19 Dec 25 06:24 UTC │
	│ addons  │ functional-006924 addons list                                                                                                                                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ addons  │ functional-006924 addons list -o json                                                                                                                        │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	└─────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 06:12:12
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 06:12:12.743158 2064791 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:12:12.743269 2064791 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:12:12.743273 2064791 out.go:374] Setting ErrFile to fd 2...
	I1219 06:12:12.743277 2064791 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:12:12.743528 2064791 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:12:12.743902 2064791 out.go:368] Setting JSON to false
	I1219 06:12:12.744837 2064791 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":39279,"bootTime":1766085454,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:12:12.744896 2064791 start.go:143] virtualization:  
	I1219 06:12:12.748217 2064791 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:12:12.751238 2064791 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:12:12.751295 2064791 notify.go:221] Checking for updates...
	I1219 06:12:12.757153 2064791 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:12:12.760103 2064791 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:12:12.763068 2064791 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:12:12.765948 2064791 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:12:12.768902 2064791 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:12:12.772437 2064791 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:12:12.772538 2064791 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:12:12.804424 2064791 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:12:12.804525 2064791 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:12:12.859954 2064791 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-19 06:12:12.850685523 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:12:12.860047 2064791 docker.go:319] overlay module found
	I1219 06:12:12.863098 2064791 out.go:179] * Using the docker driver based on existing profile
	I1219 06:12:12.866014 2064791 start.go:309] selected driver: docker
	I1219 06:12:12.866030 2064791 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:12:12.866122 2064791 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:12:12.866232 2064791 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:12:12.920329 2064791 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-19 06:12:12.911575892 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:12:12.920732 2064791 start_flags.go:993] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1219 06:12:12.920793 2064791 cni.go:84] Creating CNI manager for ""
	I1219 06:12:12.920848 2064791 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:12:12.920889 2064791 start.go:353] cluster config:
	{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:12:12.924076 2064791 out.go:179] * Starting "functional-006924" primary control-plane node in "functional-006924" cluster
	I1219 06:12:12.926767 2064791 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1219 06:12:12.929823 2064791 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1219 06:12:12.932605 2064791 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:12:12.932642 2064791 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1219 06:12:12.932650 2064791 cache.go:65] Caching tarball of preloaded images
	I1219 06:12:12.932677 2064791 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1219 06:12:12.932745 2064791 preload.go:238] Found /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1219 06:12:12.932796 2064791 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1219 06:12:12.932911 2064791 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/config.json ...
	I1219 06:12:12.951789 2064791 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1219 06:12:12.951800 2064791 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1219 06:12:12.951830 2064791 cache.go:243] Successfully downloaded all kic artifacts
	I1219 06:12:12.951863 2064791 start.go:360] acquireMachinesLock for functional-006924: {Name:mkc84f48e83d18024791d45db780f3ccd746613a Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1219 06:12:12.951927 2064791 start.go:364] duration metric: took 47.033µs to acquireMachinesLock for "functional-006924"
	I1219 06:12:12.951947 2064791 start.go:96] Skipping create...Using existing machine configuration
	I1219 06:12:12.951951 2064791 fix.go:54] fixHost starting: 
	I1219 06:12:12.952210 2064791 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
	I1219 06:12:12.969279 2064791 fix.go:112] recreateIfNeeded on functional-006924: state=Running err=<nil>
	W1219 06:12:12.969299 2064791 fix.go:138] unexpected machine state, will restart: <nil>
	I1219 06:12:12.972432 2064791 out.go:252] * Updating the running docker "functional-006924" container ...
	I1219 06:12:12.972457 2064791 machine.go:94] provisionDockerMachine start ...
	I1219 06:12:12.972536 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:12.989705 2064791 main.go:144] libmachine: Using SSH client type: native
	I1219 06:12:12.990045 2064791 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:12:12.990052 2064791 main.go:144] libmachine: About to run SSH command:
	hostname
	I1219 06:12:13.144528 2064791 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:12:13.144543 2064791 ubuntu.go:182] provisioning hostname "functional-006924"
	I1219 06:12:13.144626 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:13.163735 2064791 main.go:144] libmachine: Using SSH client type: native
	I1219 06:12:13.164043 2064791 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:12:13.164057 2064791 main.go:144] libmachine: About to run SSH command:
	sudo hostname functional-006924 && echo "functional-006924" | sudo tee /etc/hostname
	I1219 06:12:13.331538 2064791 main.go:144] libmachine: SSH cmd err, output: <nil>: functional-006924
	
	I1219 06:12:13.331610 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:13.350490 2064791 main.go:144] libmachine: Using SSH client type: native
	I1219 06:12:13.350800 2064791 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34704 <nil> <nil>}
	I1219 06:12:13.350813 2064791 main.go:144] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-006924' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-006924/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-006924' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1219 06:12:13.509192 2064791 main.go:144] libmachine: SSH cmd err, output: <nil>: 
	I1219 06:12:13.509210 2064791 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22230-1998525/.minikube CaCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22230-1998525/.minikube}
	I1219 06:12:13.509245 2064791 ubuntu.go:190] setting up certificates
	I1219 06:12:13.509254 2064791 provision.go:84] configureAuth start
	I1219 06:12:13.509315 2064791 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:12:13.528067 2064791 provision.go:143] copyHostCerts
	I1219 06:12:13.528151 2064791 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem, removing ...
	I1219 06:12:13.528164 2064791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:12:13.528239 2064791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem (1671 bytes)
	I1219 06:12:13.528339 2064791 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem, removing ...
	I1219 06:12:13.528348 2064791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:12:13.528375 2064791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem (1078 bytes)
	I1219 06:12:13.528452 2064791 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem, removing ...
	I1219 06:12:13.528456 2064791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:12:13.528480 2064791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem (1123 bytes)
	I1219 06:12:13.528529 2064791 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem org=jenkins.functional-006924 san=[127.0.0.1 192.168.49.2 functional-006924 localhost minikube]
	I1219 06:12:13.839797 2064791 provision.go:177] copyRemoteCerts
	I1219 06:12:13.839849 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1219 06:12:13.839888 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:13.857134 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:13.968475 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1219 06:12:13.985747 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1219 06:12:14.005527 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1219 06:12:14.024925 2064791 provision.go:87] duration metric: took 515.64823ms to configureAuth
	I1219 06:12:14.024943 2064791 ubuntu.go:206] setting minikube options for container-runtime
	I1219 06:12:14.025140 2064791 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:12:14.025146 2064791 machine.go:97] duration metric: took 1.052684031s to provisionDockerMachine
	I1219 06:12:14.025152 2064791 start.go:293] postStartSetup for "functional-006924" (driver="docker")
	I1219 06:12:14.025162 2064791 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1219 06:12:14.025218 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1219 06:12:14.025263 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.043178 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.148605 2064791 ssh_runner.go:195] Run: cat /etc/os-release
	I1219 06:12:14.151719 2064791 main.go:144] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1219 06:12:14.151753 2064791 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1219 06:12:14.151766 2064791 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/addons for local assets ...
	I1219 06:12:14.151823 2064791 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/files for local assets ...
	I1219 06:12:14.151902 2064791 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> 20003862.pem in /etc/ssl/certs
	I1219 06:12:14.151975 2064791 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts -> hosts in /etc/test/nested/copy/2000386
	I1219 06:12:14.152026 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/2000386
	I1219 06:12:14.159336 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:12:14.177055 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts --> /etc/test/nested/copy/2000386/hosts (40 bytes)
	I1219 06:12:14.195053 2064791 start.go:296] duration metric: took 169.886807ms for postStartSetup
	I1219 06:12:14.195138 2064791 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:12:14.195175 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.212871 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.317767 2064791 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1219 06:12:14.322386 2064791 fix.go:56] duration metric: took 1.37042768s for fixHost
	I1219 06:12:14.322401 2064791 start.go:83] releasing machines lock for "functional-006924", held for 1.370467196s
	I1219 06:12:14.322474 2064791 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-006924
	I1219 06:12:14.339208 2064791 ssh_runner.go:195] Run: cat /version.json
	I1219 06:12:14.339250 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.339514 2064791 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1219 06:12:14.339574 2064791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
	I1219 06:12:14.363989 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.366009 2064791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
	I1219 06:12:14.468260 2064791 ssh_runner.go:195] Run: systemctl --version
	I1219 06:12:14.559810 2064791 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1219 06:12:14.563901 2064791 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1219 06:12:14.563968 2064791 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1219 06:12:14.571453 2064791 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1219 06:12:14.571466 2064791 start.go:496] detecting cgroup driver to use...
	I1219 06:12:14.571496 2064791 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1219 06:12:14.571541 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1219 06:12:14.588970 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1219 06:12:14.603919 2064791 docker.go:218] disabling cri-docker service (if available) ...
	I1219 06:12:14.603971 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1219 06:12:14.620412 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1219 06:12:14.634912 2064791 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1219 06:12:14.757018 2064791 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1219 06:12:14.879281 2064791 docker.go:234] disabling docker service ...
	I1219 06:12:14.879341 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1219 06:12:14.894279 2064791 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1219 06:12:14.907362 2064791 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1219 06:12:15.033676 2064791 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1219 06:12:15.155919 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1219 06:12:15.169590 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1219 06:12:15.184917 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1219 06:12:15.194691 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1219 06:12:15.203742 2064791 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1219 06:12:15.203801 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1219 06:12:15.212945 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:12:15.221903 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1219 06:12:15.231019 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:12:15.239988 2064791 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1219 06:12:15.248292 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1219 06:12:15.257554 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1219 06:12:15.266460 2064791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1219 06:12:15.275351 2064791 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1219 06:12:15.282864 2064791 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1219 06:12:15.290662 2064791 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:12:15.400462 2064791 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1219 06:12:15.544853 2064791 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1219 06:12:15.544914 2064791 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1219 06:12:15.549076 2064791 start.go:564] Will wait 60s for crictl version
	I1219 06:12:15.549132 2064791 ssh_runner.go:195] Run: which crictl
	I1219 06:12:15.552855 2064791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1219 06:12:15.578380 2064791 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1219 06:12:15.578461 2064791 ssh_runner.go:195] Run: containerd --version
	I1219 06:12:15.600920 2064791 ssh_runner.go:195] Run: containerd --version
	I1219 06:12:15.626436 2064791 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1219 06:12:15.629308 2064791 cli_runner.go:164] Run: docker network inspect functional-006924 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 06:12:15.645624 2064791 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1219 06:12:15.652379 2064791 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1219 06:12:15.655147 2064791 kubeadm.go:884] updating cluster {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1219 06:12:15.655272 2064791 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:12:15.655368 2064791 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:12:15.679674 2064791 containerd.go:627] all images are preloaded for containerd runtime
	I1219 06:12:15.679686 2064791 containerd.go:534] images already preloaded, skipping extraction
	I1219 06:12:15.679751 2064791 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:12:15.704545 2064791 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 06:12:15.704557 2064791 cache_images.go:86] Images are preloaded, skipping loading
	I1219 06:12:15.704563 2064791 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-rc.1 containerd true true} ...
	I1219 06:12:15.704666 2064791 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-006924 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1219 06:12:15.704733 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s info
	I1219 06:12:15.729671 2064791 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1219 06:12:15.729690 2064791 cni.go:84] Creating CNI manager for ""
	I1219 06:12:15.729697 2064791 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:12:15.729711 2064791 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1219 06:12:15.729738 2064791 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-006924 NodeName:functional-006924 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1219 06:12:15.729853 2064791 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-006924"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1219 06:12:15.729919 2064791 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1219 06:12:15.737786 2064791 binaries.go:51] Found k8s binaries, skipping transfer
	I1219 06:12:15.737845 2064791 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1219 06:12:15.745456 2064791 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (326 bytes)
	I1219 06:12:15.758378 2064791 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1219 06:12:15.775454 2064791 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2085 bytes)
	I1219 06:12:15.788878 2064791 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1219 06:12:15.792954 2064791 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:12:15.901526 2064791 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:12:16.150661 2064791 certs.go:69] Setting up /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924 for IP: 192.168.49.2
	I1219 06:12:16.150673 2064791 certs.go:195] generating shared ca certs ...
	I1219 06:12:16.150687 2064791 certs.go:227] acquiring lock for ca certs: {Name:mk382c71693ea4061363f97b153b21bf6cdf5f38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:12:16.150828 2064791 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key
	I1219 06:12:16.150868 2064791 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key
	I1219 06:12:16.150873 2064791 certs.go:257] generating profile certs ...
	I1219 06:12:16.150961 2064791 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.key
	I1219 06:12:16.151009 2064791 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key.febe6fed
	I1219 06:12:16.151048 2064791 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key
	I1219 06:12:16.151165 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem (1338 bytes)
	W1219 06:12:16.151195 2064791 certs.go:480] ignoring /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386_empty.pem, impossibly tiny 0 bytes
	I1219 06:12:16.151202 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem (1679 bytes)
	I1219 06:12:16.151230 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem (1078 bytes)
	I1219 06:12:16.151264 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem (1123 bytes)
	I1219 06:12:16.151286 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem (1671 bytes)
	I1219 06:12:16.151329 2064791 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:12:16.151962 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1219 06:12:16.174202 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1219 06:12:16.194590 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1219 06:12:16.215085 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1219 06:12:16.232627 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1219 06:12:16.250371 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1219 06:12:16.267689 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1219 06:12:16.285522 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1219 06:12:16.302837 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /usr/share/ca-certificates/20003862.pem (1708 bytes)
	I1219 06:12:16.320411 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1219 06:12:16.337922 2064791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem --> /usr/share/ca-certificates/2000386.pem (1338 bytes)
	I1219 06:12:16.355077 2064791 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (722 bytes)
	I1219 06:12:16.368122 2064791 ssh_runner.go:195] Run: openssl version
	I1219 06:12:16.374305 2064791 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.381720 2064791 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1219 06:12:16.389786 2064791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.393456 2064791 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.393514 2064791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:12:16.434859 2064791 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1219 06:12:16.442942 2064791 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.450665 2064791 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/2000386.pem /etc/ssl/certs/2000386.pem
	I1219 06:12:16.458612 2064791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.462545 2064791 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.462603 2064791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/2000386.pem
	I1219 06:12:16.503732 2064791 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1219 06:12:16.511394 2064791 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.519328 2064791 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/20003862.pem /etc/ssl/certs/20003862.pem
	I1219 06:12:16.526844 2064791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.530487 2064791 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.530547 2064791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/20003862.pem
	I1219 06:12:16.571532 2064791 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1219 06:12:16.579524 2064791 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:12:16.583470 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1219 06:12:16.624483 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1219 06:12:16.665575 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1219 06:12:16.707109 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1219 06:12:16.749520 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1219 06:12:16.790988 2064791 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1219 06:12:16.831921 2064791 kubeadm.go:401] StartCluster: {Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:12:16.832006 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1219 06:12:16.832084 2064791 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:12:16.857771 2064791 cri.go:92] found id: ""
	I1219 06:12:16.857833 2064791 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1219 06:12:16.866091 2064791 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1219 06:12:16.866101 2064791 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1219 06:12:16.866158 2064791 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1219 06:12:16.873926 2064791 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.874482 2064791 kubeconfig.go:125] found "functional-006924" server: "https://192.168.49.2:8441"
	I1219 06:12:16.875731 2064791 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1219 06:12:16.883987 2064791 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-19 05:57:41.594715365 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-19 06:12:15.784216685 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1219 06:12:16.884007 2064791 kubeadm.go:1161] stopping kube-system containers ...
	I1219 06:12:16.884018 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1219 06:12:16.884079 2064791 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:12:16.914439 2064791 cri.go:92] found id: ""
	I1219 06:12:16.914509 2064791 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1219 06:12:16.934128 2064791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:12:16.942432 2064791 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5631 Dec 19 06:01 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5640 Dec 19 06:01 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec 19 06:01 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec 19 06:01 /etc/kubernetes/scheduler.conf
	
	I1219 06:12:16.942490 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 06:12:16.950312 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 06:12:16.957901 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.957957 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:12:16.965831 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 06:12:16.973975 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.974043 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:12:16.981885 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 06:12:16.989698 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:12:16.989754 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:12:16.997294 2064791 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1219 06:12:17.007519 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:17.060607 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:18.829242 2064791 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.768608779s)
	I1219 06:12:18.829304 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:19.030093 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:19.096673 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:12:19.143573 2064791 api_server.go:52] waiting for apiserver process to appear ...
	I1219 06:12:19.143640 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:19.643853 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:20.143947 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:20.643846 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:21.143937 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:21.644473 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:22.143865 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:22.643833 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:23.143826 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:23.644236 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:24.144477 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:24.643853 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:25.144064 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:25.643843 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:26.144063 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:26.644478 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:27.144296 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:27.644722 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:28.143844 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:28.643941 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:29.144786 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:29.644723 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:30.143963 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:30.644625 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:31.144751 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:31.643964 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:32.143826 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:12:32.644605 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	[... same "sudo pgrep -xnf kube-apiserver.*minikube.*" probe repeated every ~500ms, 06:12:33 through 06:13:18 ...]
	I1219 06:13:18.643722 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
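	[editor note] The probe above runs `pgrep` roughly twice a second until the kube-apiserver process appears or a deadline passes, after which minikube falls through to log gathering. A minimal bash sketch of that poll-until-deadline pattern (this is an illustration of the pattern, not minikube's actual Go implementation in `ssh_runner`/`kverify`):

```shell
#!/usr/bin/env bash
# Poll a command every INTERVAL seconds until it succeeds or TIMEOUT seconds
# elapse. The log's real probe command would be:
#   sudo pgrep -xnf 'kube-apiserver.*minikube.*'
wait_for() {
  local timeout=$1 interval=$2; shift 2
  local deadline=$((SECONDS + timeout))      # SECONDS: seconds since shell start
  while ! "$@" >/dev/null 2>&1; do
    (( SECONDS >= deadline )) && return 1    # gave up; caller falls back to diagnostics
    sleep "$interval"
  done
}

# A command that always succeeds returns immediately:
wait_for 5 0.5 true && echo "up"
# A command that never succeeds exhausts the deadline:
wait_for 1 0.5 false || echo "timed out"
```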
	I1219 06:13:19.143899 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:19.143976 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:19.173119 2064791 cri.go:92] found id: ""
	I1219 06:13:19.173133 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.173141 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:19.173146 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:19.173204 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:19.207794 2064791 cri.go:92] found id: ""
	I1219 06:13:19.207807 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.207814 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:19.207819 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:19.207884 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:19.237060 2064791 cri.go:92] found id: ""
	I1219 06:13:19.237074 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.237081 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:19.237092 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:19.237154 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:19.262099 2064791 cri.go:92] found id: ""
	I1219 06:13:19.262114 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.262121 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:19.262126 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:19.262185 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:19.287540 2064791 cri.go:92] found id: ""
	I1219 06:13:19.287554 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.287561 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:19.287566 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:19.287632 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:19.315088 2064791 cri.go:92] found id: ""
	I1219 06:13:19.315102 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.315109 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:19.315115 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:19.315176 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:19.340777 2064791 cri.go:92] found id: ""
	I1219 06:13:19.340791 2064791 logs.go:282] 0 containers: []
	W1219 06:13:19.340798 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:19.340806 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:19.340818 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:19.357916 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:19.357932 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:19.426302 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:19.417866   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.418382   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420072   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420408   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.421854   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:19.417866   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.418382   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420072   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.420408   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:19.421854   10752 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
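	[editor note] Every `describe nodes` attempt above fails with `dial tcp [::1]:8441: connect: connection refused` — an active TCP reset, meaning the host is reachable but nothing is listening on the apiserver port, consistent with the empty `crictl ps` results. A quick hand-check one could run on the node to confirm the port state (the `/dev/tcp` probe is a bash feature, not something minikube itself runs):

```shell
#!/usr/bin/env bash
# Returns 0 if something accepts TCP connections on host:port, non-zero on
# "connection refused" (or any other connect failure). The subshell closes
# fd 3 automatically on exit.
port_open() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

if port_open 127.0.0.1 8441; then
  echo "apiserver port 8441 is listening"
else
  echo "nothing listening on 8441"
fi
```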
	I1219 06:13:19.426313 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:19.426323 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:19.488347 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:19.488367 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:19.520211 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:19.520229 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
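	[editor note] The "container status" gather above relies on shell fallback chaining: `sudo \`which crictl || echo crictl\` ps -a || sudo docker ps -a` first resolves crictl to a full path (falling back to the bare name so sudo's own PATH lookup gets a second chance), then falls back to docker entirely if the crictl invocation fails. The resolution half of that pattern in isolation, with stand-in names:

```shell
#!/usr/bin/env bash
# `command -v` prints the full path of a tool if found; `|| echo "$1"` echoes
# the bare name back when it is not on PATH, mirroring `which crictl || echo crictl`.
resolve() { command -v "$1" || echo "$1"; }

resolve sh             # an installed tool: prints its full path (e.g. /bin/sh)
resolve no-such-tool   # a missing tool: prints the bare name back
```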
	I1219 06:13:22.084930 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:22.095535 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:22.095602 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:22.122011 2064791 cri.go:92] found id: ""
	I1219 06:13:22.122025 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.122034 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:22.122059 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:22.122131 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:22.146880 2064791 cri.go:92] found id: ""
	I1219 06:13:22.146893 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.146900 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:22.146905 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:22.146975 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:22.176007 2064791 cri.go:92] found id: ""
	I1219 06:13:22.176021 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.176028 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:22.176033 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:22.176095 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:22.211343 2064791 cri.go:92] found id: ""
	I1219 06:13:22.211357 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.211365 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:22.211370 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:22.211429 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:22.235806 2064791 cri.go:92] found id: ""
	I1219 06:13:22.235829 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.235836 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:22.235841 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:22.235910 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:22.260858 2064791 cri.go:92] found id: ""
	I1219 06:13:22.260882 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.260888 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:22.260894 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:22.260954 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:22.285583 2064791 cri.go:92] found id: ""
	I1219 06:13:22.285597 2064791 logs.go:282] 0 containers: []
	W1219 06:13:22.285604 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:22.285613 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:22.285624 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:22.302970 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:22.302988 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:22.371208 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:22.362378   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.363321   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365234   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365718   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.367191   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:22.362378   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.363321   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365234   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.365718   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:22.367191   10854 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:22.371227 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:22.371238 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:22.433354 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:22.433373 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:22.468288 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:22.468305 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:25.028097 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:25.038266 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:25.038327 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:25.066109 2064791 cri.go:92] found id: ""
	I1219 06:13:25.066123 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.066130 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:25.066136 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:25.066199 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:25.091083 2064791 cri.go:92] found id: ""
	I1219 06:13:25.091096 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.091103 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:25.091109 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:25.091175 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:25.116729 2064791 cri.go:92] found id: ""
	I1219 06:13:25.116743 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.116750 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:25.116808 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:25.116890 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:25.145471 2064791 cri.go:92] found id: ""
	I1219 06:13:25.145485 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.145492 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:25.145497 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:25.145555 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:25.173780 2064791 cri.go:92] found id: ""
	I1219 06:13:25.173795 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.173801 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:25.173807 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:25.173876 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:25.202994 2064791 cri.go:92] found id: ""
	I1219 06:13:25.203008 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.203015 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:25.203021 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:25.203082 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:25.228548 2064791 cri.go:92] found id: ""
	I1219 06:13:25.228563 2064791 logs.go:282] 0 containers: []
	W1219 06:13:25.228570 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:25.228578 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:25.228590 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:25.260074 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:25.260090 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:25.316293 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:25.316311 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:25.333755 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:25.333771 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:25.395261 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:25.387061   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.387481   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389091   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389803   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.391447   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:25.387061   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.387481   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389091   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.389803   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:25.391447   10971 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:25.395273 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:25.395290 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:27.958003 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:27.968507 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:27.968571 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:27.994859 2064791 cri.go:92] found id: ""
	I1219 06:13:27.994872 2064791 logs.go:282] 0 containers: []
	W1219 06:13:27.994879 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:27.994884 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:27.994942 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:28.023716 2064791 cri.go:92] found id: ""
	I1219 06:13:28.023729 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.023736 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:28.023741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:28.023807 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:28.048490 2064791 cri.go:92] found id: ""
	I1219 06:13:28.048504 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.048512 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:28.048517 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:28.048575 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:28.074305 2064791 cri.go:92] found id: ""
	I1219 06:13:28.074319 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.074326 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:28.074332 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:28.074392 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:28.098924 2064791 cri.go:92] found id: ""
	I1219 06:13:28.098938 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.098945 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:28.098950 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:28.099021 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:28.123000 2064791 cri.go:92] found id: ""
	I1219 06:13:28.123013 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.123021 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:28.123026 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:28.123091 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:28.150415 2064791 cri.go:92] found id: ""
	I1219 06:13:28.150428 2064791 logs.go:282] 0 containers: []
	W1219 06:13:28.150435 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:28.150443 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:28.150453 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:28.210763 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:28.210782 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:28.230191 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:28.230208 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:28.294389 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:28.286078   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.286664   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288135   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288528   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.290322   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:28.286078   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.286664   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288135   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.288528   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:28.290322   11066 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:28.294400 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:28.294411 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:28.357351 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:28.357371 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:30.888172 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:30.898614 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:30.898676 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:30.926377 2064791 cri.go:92] found id: ""
	I1219 06:13:30.926391 2064791 logs.go:282] 0 containers: []
	W1219 06:13:30.926398 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:30.926403 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:30.926458 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:30.950084 2064791 cri.go:92] found id: ""
	I1219 06:13:30.950097 2064791 logs.go:282] 0 containers: []
	W1219 06:13:30.950111 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:30.950117 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:30.950180 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:30.975713 2064791 cri.go:92] found id: ""
	I1219 06:13:30.975726 2064791 logs.go:282] 0 containers: []
	W1219 06:13:30.975734 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:30.975740 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:30.975798 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:31.012698 2064791 cri.go:92] found id: ""
	I1219 06:13:31.012712 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.012719 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:31.012725 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:31.012833 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:31.036945 2064791 cri.go:92] found id: ""
	I1219 06:13:31.036958 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.036965 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:31.036970 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:31.037028 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:31.062431 2064791 cri.go:92] found id: ""
	I1219 06:13:31.062445 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.062452 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:31.062457 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:31.062538 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:31.088075 2064791 cri.go:92] found id: ""
	I1219 06:13:31.088099 2064791 logs.go:282] 0 containers: []
	W1219 06:13:31.088106 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:31.088114 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:31.088123 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:31.143908 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:31.143928 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:31.164642 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:31.164661 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:31.241367 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:31.232734   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.233582   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235302   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235873   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.237342   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:31.232734   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.233582   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235302   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.235873   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:31.237342   11174 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:31.241378 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:31.241388 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:31.304583 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:31.304602 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:33.835874 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:33.847289 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:33.847350 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:33.874497 2064791 cri.go:92] found id: ""
	I1219 06:13:33.874511 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.874518 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:33.874523 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:33.874602 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:33.899113 2064791 cri.go:92] found id: ""
	I1219 06:13:33.899127 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.899134 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:33.899139 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:33.899198 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:33.927533 2064791 cri.go:92] found id: ""
	I1219 06:13:33.927546 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.927553 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:33.927559 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:33.927616 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:33.955150 2064791 cri.go:92] found id: ""
	I1219 06:13:33.955163 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.955170 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:33.955176 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:33.955233 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:33.979739 2064791 cri.go:92] found id: ""
	I1219 06:13:33.979753 2064791 logs.go:282] 0 containers: []
	W1219 06:13:33.979760 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:33.979765 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:33.979824 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:34.005264 2064791 cri.go:92] found id: ""
	I1219 06:13:34.005283 2064791 logs.go:282] 0 containers: []
	W1219 06:13:34.005291 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:34.005298 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:34.005375 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:34.031917 2064791 cri.go:92] found id: ""
	I1219 06:13:34.031931 2064791 logs.go:282] 0 containers: []
	W1219 06:13:34.031949 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:34.031958 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:34.031968 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:34.098907 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:34.098938 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:34.117494 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:34.117513 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:34.190606 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:34.181776   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.182594   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184322   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184963   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.186654   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:34.181776   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.182594   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184322   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.184963   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:34.186654   11272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:34.190617 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:34.190630 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:34.260586 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:34.260607 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:36.792986 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:36.803226 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:36.803292 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:36.830943 2064791 cri.go:92] found id: ""
	I1219 06:13:36.830957 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.830964 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:36.830970 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:36.831029 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:36.856036 2064791 cri.go:92] found id: ""
	I1219 06:13:36.856051 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.856058 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:36.856063 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:36.856133 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:36.880807 2064791 cri.go:92] found id: ""
	I1219 06:13:36.880821 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.880828 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:36.880834 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:36.880893 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:36.904515 2064791 cri.go:92] found id: ""
	I1219 06:13:36.904529 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.904536 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:36.904542 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:36.904601 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:36.929517 2064791 cri.go:92] found id: ""
	I1219 06:13:36.929530 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.929538 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:36.929543 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:36.929615 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:36.953623 2064791 cri.go:92] found id: ""
	I1219 06:13:36.953636 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.953644 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:36.953650 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:36.953706 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:36.978769 2064791 cri.go:92] found id: ""
	I1219 06:13:36.978783 2064791 logs.go:282] 0 containers: []
	W1219 06:13:36.978790 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:36.978797 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:36.978807 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:37.036051 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:37.036072 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:37.053881 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:37.053898 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:37.117512 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:37.109460   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.110050   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.111554   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.112021   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.113512   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:37.109460   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.110050   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.111554   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.112021   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:37.113512   11377 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:37.117523 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:37.117532 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:37.185580 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:37.185599 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:39.724185 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:39.735602 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:39.735670 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:39.760200 2064791 cri.go:92] found id: ""
	I1219 06:13:39.760214 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.760222 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:39.760227 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:39.760286 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:39.787416 2064791 cri.go:92] found id: ""
	I1219 06:13:39.787429 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.787437 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:39.787442 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:39.787505 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:39.811808 2064791 cri.go:92] found id: ""
	I1219 06:13:39.811822 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.811830 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:39.811836 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:39.811902 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:39.837072 2064791 cri.go:92] found id: ""
	I1219 06:13:39.837086 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.837093 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:39.837099 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:39.837200 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:39.866418 2064791 cri.go:92] found id: ""
	I1219 06:13:39.866432 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.866438 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:39.866444 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:39.866502 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:39.894744 2064791 cri.go:92] found id: ""
	I1219 06:13:39.894758 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.894765 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:39.894770 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:39.894833 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:39.921608 2064791 cri.go:92] found id: ""
	I1219 06:13:39.921622 2064791 logs.go:282] 0 containers: []
	W1219 06:13:39.921629 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:39.921643 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:39.921654 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:39.985200 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:39.985220 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:40.004064 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:40.004091 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:40.077619 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:40.069025   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.069761   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.071391   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.072228   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.073414   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:40.069025   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.069761   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.071391   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.072228   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:40.073414   11483 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:40.077631 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:40.077641 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:40.142102 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:40.142127 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:42.682372 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:42.692608 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:42.692675 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:42.716749 2064791 cri.go:92] found id: ""
	I1219 06:13:42.716796 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.716804 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:42.716809 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:42.716888 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:42.740973 2064791 cri.go:92] found id: ""
	I1219 06:13:42.740986 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.740993 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:42.740999 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:42.741064 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:42.765521 2064791 cri.go:92] found id: ""
	I1219 06:13:42.765535 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.765543 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:42.765548 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:42.765607 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:42.790000 2064791 cri.go:92] found id: ""
	I1219 06:13:42.790015 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.790034 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:42.790040 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:42.790107 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:42.813722 2064791 cri.go:92] found id: ""
	I1219 06:13:42.813736 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.813743 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:42.813752 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:42.813814 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:42.838912 2064791 cri.go:92] found id: ""
	I1219 06:13:42.838926 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.838934 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:42.838939 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:42.839002 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:42.867044 2064791 cri.go:92] found id: ""
	I1219 06:13:42.867058 2064791 logs.go:282] 0 containers: []
	W1219 06:13:42.867065 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:42.867073 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:42.867083 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:42.923612 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:42.923632 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:42.941274 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:42.941293 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:43.008705 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:42.998396   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:42.999004   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000468   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000919   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.003777   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:42.998396   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:42.999004   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000468   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.000919   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:43.003777   11587 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:43.008716 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:43.008736 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:43.074629 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:43.074654 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:45.608725 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:45.619043 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:45.619107 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:45.645025 2064791 cri.go:92] found id: ""
	I1219 06:13:45.645041 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.645049 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:45.645054 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:45.645120 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:45.671700 2064791 cri.go:92] found id: ""
	I1219 06:13:45.671716 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.671723 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:45.671735 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:45.671797 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:45.701839 2064791 cri.go:92] found id: ""
	I1219 06:13:45.701864 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.701872 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:45.701878 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:45.701947 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:45.731819 2064791 cri.go:92] found id: ""
	I1219 06:13:45.731834 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.731841 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:45.731847 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:45.731910 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:45.758372 2064791 cri.go:92] found id: ""
	I1219 06:13:45.758386 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.758393 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:45.758399 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:45.758464 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:45.784713 2064791 cri.go:92] found id: ""
	I1219 06:13:45.784727 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.784734 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:45.784739 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:45.784829 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:45.811948 2064791 cri.go:92] found id: ""
	I1219 06:13:45.811962 2064791 logs.go:282] 0 containers: []
	W1219 06:13:45.811969 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:45.811977 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:45.811987 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:45.868299 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:45.868317 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:45.886032 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:45.886049 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:45.952733 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:45.944285   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.944957   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946453   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946859   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.948306   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:45.944285   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.944957   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946453   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.946859   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:45.948306   11690 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:45.952743 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:45.952783 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:46.020565 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:46.020588 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:48.550865 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:48.561408 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:48.561483 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:48.586776 2064791 cri.go:92] found id: ""
	I1219 06:13:48.586790 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.586797 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:48.586802 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:48.586864 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:48.612701 2064791 cri.go:92] found id: ""
	I1219 06:13:48.612715 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.612722 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:48.612727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:48.612808 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:48.637097 2064791 cri.go:92] found id: ""
	I1219 06:13:48.637110 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.637118 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:48.637124 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:48.637183 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:48.662701 2064791 cri.go:92] found id: ""
	I1219 06:13:48.662715 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.662722 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:48.662727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:48.662785 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:48.690291 2064791 cri.go:92] found id: ""
	I1219 06:13:48.690304 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.690311 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:48.690316 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:48.690376 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:48.715968 2064791 cri.go:92] found id: ""
	I1219 06:13:48.715983 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.715990 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:48.715995 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:48.716059 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:48.741069 2064791 cri.go:92] found id: ""
	I1219 06:13:48.741082 2064791 logs.go:282] 0 containers: []
	W1219 06:13:48.741090 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:48.741097 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:48.741113 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:48.796842 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:48.796863 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:48.814146 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:48.814166 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:48.879995 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:48.871133   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.871771   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.873597   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.874180   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.875997   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:48.871133   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.871771   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.873597   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.874180   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:48.875997   11794 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:48.880005 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:48.880017 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:48.943211 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:48.943231 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:51.472961 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:51.483727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:51.483805 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:51.513333 2064791 cri.go:92] found id: ""
	I1219 06:13:51.513347 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.513354 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:51.513360 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:51.513426 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:51.539359 2064791 cri.go:92] found id: ""
	I1219 06:13:51.539373 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.539380 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:51.539392 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:51.539449 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:51.564730 2064791 cri.go:92] found id: ""
	I1219 06:13:51.564743 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.564750 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:51.564794 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:51.564855 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:51.590117 2064791 cri.go:92] found id: ""
	I1219 06:13:51.590138 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.590145 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:51.590150 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:51.590210 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:51.614688 2064791 cri.go:92] found id: ""
	I1219 06:13:51.614702 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.614709 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:51.614715 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:51.614778 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:51.638492 2064791 cri.go:92] found id: ""
	I1219 06:13:51.638508 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.638518 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:51.638524 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:51.638597 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:51.666861 2064791 cri.go:92] found id: ""
	I1219 06:13:51.666874 2064791 logs.go:282] 0 containers: []
	W1219 06:13:51.666881 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:51.666888 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:51.666899 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:51.731208 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:51.723294   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.723881   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725543   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725958   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.727478   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:51.723294   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.723881   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725543   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.725958   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:51.727478   11895 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:51.731218 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:51.731228 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:51.793354 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:51.793375 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:51.819761 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:51.819784 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:51.877976 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:51.877996 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:54.395396 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:54.405788 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:54.405848 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:54.444121 2064791 cri.go:92] found id: ""
	I1219 06:13:54.444151 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.444159 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:54.444164 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:54.444243 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:54.471038 2064791 cri.go:92] found id: ""
	I1219 06:13:54.471064 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.471072 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:54.471077 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:54.471160 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:54.500364 2064791 cri.go:92] found id: ""
	I1219 06:13:54.500377 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.500385 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:54.500390 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:54.500450 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:54.525919 2064791 cri.go:92] found id: ""
	I1219 06:13:54.525934 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.525941 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:54.525962 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:54.526021 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:54.551211 2064791 cri.go:92] found id: ""
	I1219 06:13:54.551225 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.551232 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:54.551239 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:54.551310 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:54.577841 2064791 cri.go:92] found id: ""
	I1219 06:13:54.577854 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.577861 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:54.577866 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:54.577931 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:54.602636 2064791 cri.go:92] found id: ""
	I1219 06:13:54.602650 2064791 logs.go:282] 0 containers: []
	W1219 06:13:54.602656 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:54.602664 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:54.602675 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:54.619644 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:54.619661 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:54.682901 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:54.674199   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.675104   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.676718   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.677329   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.678988   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:54.674199   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.675104   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.676718   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.677329   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:54.678988   12005 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:54.682911 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:54.682921 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:54.749370 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:54.749393 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:13:54.780731 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:54.780747 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:57.338712 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:13:57.349237 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:13:57.349299 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:13:57.378161 2064791 cri.go:92] found id: ""
	I1219 06:13:57.378175 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.378181 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:13:57.378187 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:13:57.378247 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:13:57.403073 2064791 cri.go:92] found id: ""
	I1219 06:13:57.403087 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.403094 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:13:57.403099 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:13:57.403160 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:13:57.431222 2064791 cri.go:92] found id: ""
	I1219 06:13:57.431236 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.431244 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:13:57.431249 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:13:57.431306 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:13:57.466943 2064791 cri.go:92] found id: ""
	I1219 06:13:57.466957 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.466964 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:13:57.466969 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:13:57.467027 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:13:57.493181 2064791 cri.go:92] found id: ""
	I1219 06:13:57.493194 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.493201 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:13:57.493206 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:13:57.493265 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:13:57.517521 2064791 cri.go:92] found id: ""
	I1219 06:13:57.517534 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.517543 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:13:57.517549 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:13:57.517606 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:13:57.546827 2064791 cri.go:92] found id: ""
	I1219 06:13:57.546841 2064791 logs.go:282] 0 containers: []
	W1219 06:13:57.546848 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:13:57.546856 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:13:57.546865 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:13:57.603521 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:13:57.603540 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:13:57.620971 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:13:57.620988 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:13:57.687316 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:13:57.678838   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.679404   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.680987   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.681452   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.683006   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:13:57.678838   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.679404   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.680987   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.681452   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:13:57.683006   12107 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:13:57.687326 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:13:57.687336 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:13:57.759758 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:13:57.759787 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:00.293478 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:00.313120 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:00.313205 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:00.349921 2064791 cri.go:92] found id: ""
	I1219 06:14:00.349938 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.349947 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:00.349953 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:00.350031 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:00.381005 2064791 cri.go:92] found id: ""
	I1219 06:14:00.381022 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.381031 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:00.381037 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:00.381113 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:00.415179 2064791 cri.go:92] found id: ""
	I1219 06:14:00.415194 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.415202 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:00.415207 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:00.415268 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:00.455068 2064791 cri.go:92] found id: ""
	I1219 06:14:00.455084 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.455090 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:00.455096 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:00.455170 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:00.488360 2064791 cri.go:92] found id: ""
	I1219 06:14:00.488374 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.488382 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:00.488387 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:00.488450 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:00.514399 2064791 cri.go:92] found id: ""
	I1219 06:14:00.514414 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.514420 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:00.514426 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:00.514485 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:00.544639 2064791 cri.go:92] found id: ""
	I1219 06:14:00.544655 2064791 logs.go:282] 0 containers: []
	W1219 06:14:00.544662 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:00.544670 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:00.544683 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:00.562442 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:00.562459 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:00.630032 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:00.620824   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.621603   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.623426   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.624012   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.625740   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:00.620824   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.621603   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.623426   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.624012   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:00.625740   12211 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:00.630043 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:00.630053 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:00.693056 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:00.693075 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:00.724344 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:00.724362 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:03.282407 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:03.292404 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:03.292463 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:03.322285 2064791 cri.go:92] found id: ""
	I1219 06:14:03.322298 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.322305 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:03.322310 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:03.322377 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:03.345824 2064791 cri.go:92] found id: ""
	I1219 06:14:03.345838 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.345846 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:03.345852 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:03.345913 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:03.369194 2064791 cri.go:92] found id: ""
	I1219 06:14:03.369208 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.369214 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:03.369220 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:03.369280 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:03.393453 2064791 cri.go:92] found id: ""
	I1219 06:14:03.393467 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.393474 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:03.393479 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:03.393538 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:03.423067 2064791 cri.go:92] found id: ""
	I1219 06:14:03.423082 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.423088 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:03.423093 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:03.423149 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:03.449404 2064791 cri.go:92] found id: ""
	I1219 06:14:03.449418 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.449424 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:03.449430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:03.449491 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:03.483320 2064791 cri.go:92] found id: ""
	I1219 06:14:03.483334 2064791 logs.go:282] 0 containers: []
	W1219 06:14:03.483342 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:03.483349 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:03.483360 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:03.546816 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:03.538361   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.539133   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.540745   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.541423   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.542995   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:03.538361   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.539133   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.540745   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.541423   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:03.542995   12314 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:03.546828 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:03.546840 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:03.608924 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:03.608943 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:03.640931 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:03.640947 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:03.698583 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:03.698601 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:06.217289 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:06.228468 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:06.228538 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:06.254249 2064791 cri.go:92] found id: ""
	I1219 06:14:06.254264 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.254271 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:06.254276 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:06.254335 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:06.278663 2064791 cri.go:92] found id: ""
	I1219 06:14:06.278677 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.278685 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:06.278691 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:06.278751 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:06.304128 2064791 cri.go:92] found id: ""
	I1219 06:14:06.304143 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.304150 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:06.304162 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:06.304224 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:06.330238 2064791 cri.go:92] found id: ""
	I1219 06:14:06.330252 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.330259 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:06.330265 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:06.330326 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:06.354219 2064791 cri.go:92] found id: ""
	I1219 06:14:06.354234 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.354241 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:06.354246 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:06.354307 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:06.382747 2064791 cri.go:92] found id: ""
	I1219 06:14:06.382762 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.382769 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:06.382777 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:06.382837 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:06.421656 2064791 cri.go:92] found id: ""
	I1219 06:14:06.421670 2064791 logs.go:282] 0 containers: []
	W1219 06:14:06.421677 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:06.421685 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:06.421694 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:06.498836 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:06.498857 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:06.531636 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:06.531653 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:06.590085 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:06.590106 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:06.608226 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:06.608243 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:06.675159 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:06.667629   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.668042   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669562   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669908   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.671371   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:06.667629   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.668042   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669562   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.669908   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:06.671371   12435 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:09.176005 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:09.186839 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:09.186916 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:09.211786 2064791 cri.go:92] found id: ""
	I1219 06:14:09.211800 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.211807 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:09.211812 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:09.211873 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:09.240415 2064791 cri.go:92] found id: ""
	I1219 06:14:09.240429 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.240436 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:09.240441 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:09.240503 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:09.266183 2064791 cri.go:92] found id: ""
	I1219 06:14:09.266197 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.266204 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:09.266209 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:09.266269 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:09.294483 2064791 cri.go:92] found id: ""
	I1219 06:14:09.294497 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.294504 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:09.294509 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:09.294572 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:09.319997 2064791 cri.go:92] found id: ""
	I1219 06:14:09.320011 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.320019 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:09.320024 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:09.320113 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:09.346661 2064791 cri.go:92] found id: ""
	I1219 06:14:09.346675 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.346683 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:09.346688 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:09.346746 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:09.371664 2064791 cri.go:92] found id: ""
	I1219 06:14:09.371690 2064791 logs.go:282] 0 containers: []
	W1219 06:14:09.371698 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:09.371706 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:09.371717 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:09.389515 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:09.389534 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:09.473775 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:09.465363   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.465921   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.467541   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.468108   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.469733   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:09.465363   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.465921   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.467541   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.468108   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:09.469733   12518 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:09.473785 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:09.473796 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:09.541712 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:09.541736 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:09.577440 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:09.577456 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:12.133722 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:12.144214 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:12.144277 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:12.170929 2064791 cri.go:92] found id: ""
	I1219 06:14:12.170944 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.170951 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:12.170956 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:12.171026 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:12.195988 2064791 cri.go:92] found id: ""
	I1219 06:14:12.196002 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.196008 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:12.196014 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:12.196073 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:12.221254 2064791 cri.go:92] found id: ""
	I1219 06:14:12.221269 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.221276 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:12.221281 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:12.221346 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:12.246403 2064791 cri.go:92] found id: ""
	I1219 06:14:12.246417 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.246424 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:12.246430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:12.246491 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:12.271124 2064791 cri.go:92] found id: ""
	I1219 06:14:12.271139 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.271145 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:12.271150 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:12.271209 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:12.296180 2064791 cri.go:92] found id: ""
	I1219 06:14:12.296194 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.296211 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:12.296216 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:12.296284 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:12.322520 2064791 cri.go:92] found id: ""
	I1219 06:14:12.322534 2064791 logs.go:282] 0 containers: []
	W1219 06:14:12.322541 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:12.322548 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:12.322559 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:12.349890 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:12.349907 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:12.407189 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:12.407210 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:12.426453 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:12.426469 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:12.499487 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:12.491549   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.491978   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493525   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493875   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.495402   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:12.491549   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.491978   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493525   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.493875   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:12.495402   12644 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:12.499498 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:12.499509 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:15.067160 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:15.078543 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:15.078611 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:15.104838 2064791 cri.go:92] found id: ""
	I1219 06:14:15.104852 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.104860 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:15.104865 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:15.104933 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:15.130179 2064791 cri.go:92] found id: ""
	I1219 06:14:15.130194 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.130201 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:15.130207 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:15.130268 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:15.156134 2064791 cri.go:92] found id: ""
	I1219 06:14:15.156147 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.156154 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:15.156159 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:15.156221 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:15.182543 2064791 cri.go:92] found id: ""
	I1219 06:14:15.182557 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.182564 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:15.182570 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:15.182631 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:15.212350 2064791 cri.go:92] found id: ""
	I1219 06:14:15.212364 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.212371 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:15.212376 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:15.212437 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:15.239403 2064791 cri.go:92] found id: ""
	I1219 06:14:15.239418 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.239425 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:15.239430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:15.239490 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:15.265288 2064791 cri.go:92] found id: ""
	I1219 06:14:15.265303 2064791 logs.go:282] 0 containers: []
	W1219 06:14:15.265310 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:15.265318 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:15.265328 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:15.322825 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:15.322845 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:15.339946 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:15.339963 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:15.406282 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:15.394886   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.395459   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397067   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397414   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.399914   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:15.394886   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.395459   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397067   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.397414   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:15.399914   12732 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:15.406294 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:15.406305 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:15.481322 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:15.481342 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:18.011054 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:18.022305 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:18.022367 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:18.048236 2064791 cri.go:92] found id: ""
	I1219 06:14:18.048250 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.048257 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:18.048262 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:18.048326 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:18.075811 2064791 cri.go:92] found id: ""
	I1219 06:14:18.075825 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.075833 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:18.075839 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:18.075911 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:18.101578 2064791 cri.go:92] found id: ""
	I1219 06:14:18.101593 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.101601 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:18.101607 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:18.101668 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:18.127312 2064791 cri.go:92] found id: ""
	I1219 06:14:18.127327 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.127335 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:18.127341 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:18.127400 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:18.153616 2064791 cri.go:92] found id: ""
	I1219 06:14:18.153630 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.153637 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:18.153642 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:18.153702 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:18.177937 2064791 cri.go:92] found id: ""
	I1219 06:14:18.177959 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.177967 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:18.177972 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:18.178044 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:18.211563 2064791 cri.go:92] found id: ""
	I1219 06:14:18.211576 2064791 logs.go:282] 0 containers: []
	W1219 06:14:18.211583 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:18.211591 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:18.211614 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:18.270162 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:18.270182 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:18.288230 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:18.288247 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:18.351713 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:18.343431   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.343955   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.345552   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.346118   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.347689   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:18.343431   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.343955   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.345552   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.346118   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:18.347689   12840 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:18.351723 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:18.351734 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:18.415359 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:18.415379 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:20.949383 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:20.959444 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:20.959504 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:20.984028 2064791 cri.go:92] found id: ""
	I1219 06:14:20.984041 2064791 logs.go:282] 0 containers: []
	W1219 06:14:20.984048 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:20.984054 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:20.984114 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:21.011129 2064791 cri.go:92] found id: ""
	I1219 06:14:21.011145 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.011153 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:21.011159 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:21.011232 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:21.036500 2064791 cri.go:92] found id: ""
	I1219 06:14:21.036515 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.036522 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:21.036528 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:21.036593 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:21.061075 2064791 cri.go:92] found id: ""
	I1219 06:14:21.061092 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.061099 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:21.061106 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:21.061164 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:21.086516 2064791 cri.go:92] found id: ""
	I1219 06:14:21.086532 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.086539 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:21.086545 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:21.086606 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:21.110771 2064791 cri.go:92] found id: ""
	I1219 06:14:21.110791 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.110798 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:21.110804 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:21.110861 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:21.135223 2064791 cri.go:92] found id: ""
	I1219 06:14:21.135237 2064791 logs.go:282] 0 containers: []
	W1219 06:14:21.135244 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:21.135253 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:21.135262 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:21.198022 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:21.198041 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:21.227058 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:21.227074 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:21.285376 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:21.285395 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:21.302978 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:21.302996 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:21.371361 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:21.363586   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.364188   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365313   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365885   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.367496   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:21.363586   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.364188   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365313   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.365885   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:21.367496   12960 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:23.871625 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:23.882253 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:23.882315 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:23.911700 2064791 cri.go:92] found id: ""
	I1219 06:14:23.911715 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.911722 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:23.911727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:23.911792 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:23.940526 2064791 cri.go:92] found id: ""
	I1219 06:14:23.940542 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.940549 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:23.940554 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:23.940613 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:23.965505 2064791 cri.go:92] found id: ""
	I1219 06:14:23.965520 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.965527 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:23.965532 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:23.965592 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:23.990160 2064791 cri.go:92] found id: ""
	I1219 06:14:23.990174 2064791 logs.go:282] 0 containers: []
	W1219 06:14:23.990180 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:23.990186 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:23.990244 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:24.020703 2064791 cri.go:92] found id: ""
	I1219 06:14:24.020718 2064791 logs.go:282] 0 containers: []
	W1219 06:14:24.020731 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:24.020736 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:24.020818 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:24.045597 2064791 cri.go:92] found id: ""
	I1219 06:14:24.045611 2064791 logs.go:282] 0 containers: []
	W1219 06:14:24.045619 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:24.045625 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:24.045687 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:24.070650 2064791 cri.go:92] found id: ""
	I1219 06:14:24.070665 2064791 logs.go:282] 0 containers: []
	W1219 06:14:24.070673 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:24.070681 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:24.070692 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:24.088118 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:24.088135 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:24.154756 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:24.145796   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.146235   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.147863   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.148542   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.150277   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:24.145796   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.146235   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.147863   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.148542   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:24.150277   13051 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:24.154766 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:24.154777 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:24.222682 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:24.222712 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:24.251017 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:24.251036 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:26.810547 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:26.821800 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:26.821882 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:26.851618 2064791 cri.go:92] found id: ""
	I1219 06:14:26.851632 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.851639 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:26.851644 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:26.851701 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:26.881247 2064791 cri.go:92] found id: ""
	I1219 06:14:26.881261 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.881268 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:26.881273 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:26.881331 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:26.906685 2064791 cri.go:92] found id: ""
	I1219 06:14:26.906698 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.906705 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:26.906710 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:26.906769 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:26.930800 2064791 cri.go:92] found id: ""
	I1219 06:14:26.930814 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.930821 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:26.930826 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:26.930886 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:26.955923 2064791 cri.go:92] found id: ""
	I1219 06:14:26.955936 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.955943 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:26.955949 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:26.956007 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:26.981009 2064791 cri.go:92] found id: ""
	I1219 06:14:26.981023 2064791 logs.go:282] 0 containers: []
	W1219 06:14:26.981030 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:26.981036 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:26.981100 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:27.008093 2064791 cri.go:92] found id: ""
	I1219 06:14:27.008107 2064791 logs.go:282] 0 containers: []
	W1219 06:14:27.008115 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:27.008123 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:27.008133 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:27.064465 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:27.064484 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:27.082027 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:27.082043 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:27.147050 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:27.138327   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.139038   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.140978   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.141575   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.143181   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:27.138327   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.139038   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.140978   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.141575   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:27.143181   13160 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:27.147061 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:27.147072 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:27.209843 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:27.209866 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:29.744581 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:29.755392 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:29.755453 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:29.786638 2064791 cri.go:92] found id: ""
	I1219 06:14:29.786652 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.786659 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:29.786664 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:29.786724 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:29.812212 2064791 cri.go:92] found id: ""
	I1219 06:14:29.812225 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.812232 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:29.812237 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:29.812296 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:29.836877 2064791 cri.go:92] found id: ""
	I1219 06:14:29.836892 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.836899 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:29.836905 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:29.836964 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:29.861702 2064791 cri.go:92] found id: ""
	I1219 06:14:29.861715 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.861722 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:29.861727 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:29.861786 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:29.885680 2064791 cri.go:92] found id: ""
	I1219 06:14:29.885694 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.885703 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:29.885708 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:29.885770 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:29.910947 2064791 cri.go:92] found id: ""
	I1219 06:14:29.910961 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.910968 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:29.910973 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:29.911034 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:29.935050 2064791 cri.go:92] found id: ""
	I1219 06:14:29.935065 2064791 logs.go:282] 0 containers: []
	W1219 06:14:29.935072 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:29.935080 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:29.935090 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:29.998135 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:29.998156 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:30.043603 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:30.043622 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:30.105767 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:30.105788 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:30.123694 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:30.123713 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:30.194778 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:30.185852   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.186558   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188234   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188924   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.190530   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:30.185852   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.186558   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188234   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.188924   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:30.190530   13278 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:32.694996 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:32.706674 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:32.706732 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:32.732252 2064791 cri.go:92] found id: ""
	I1219 06:14:32.732268 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.732276 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:32.732282 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:32.732344 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:32.758653 2064791 cri.go:92] found id: ""
	I1219 06:14:32.758667 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.758674 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:32.758679 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:32.758739 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:32.784000 2064791 cri.go:92] found id: ""
	I1219 06:14:32.784015 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.784032 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:32.784037 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:32.784104 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:32.812817 2064791 cri.go:92] found id: ""
	I1219 06:14:32.812840 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.812847 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:32.812856 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:32.812927 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:32.838382 2064791 cri.go:92] found id: ""
	I1219 06:14:32.838396 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.838404 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:32.838409 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:32.838470 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:32.865911 2064791 cri.go:92] found id: ""
	I1219 06:14:32.865929 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.865937 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:32.865944 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:32.866010 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:32.890355 2064791 cri.go:92] found id: ""
	I1219 06:14:32.890369 2064791 logs.go:282] 0 containers: []
	W1219 06:14:32.890376 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:32.890384 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:32.890394 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:32.946230 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:32.946249 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:32.964055 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:32.964071 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:33.030318 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:33.021305   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.022105   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.023970   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.024656   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:33.026568   13371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:14:33.030328 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:33.030341 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:33.097167 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:33.097188 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:35.628021 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:35.638217 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:35.638279 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:35.675187 2064791 cri.go:92] found id: ""
	I1219 06:14:35.675209 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.675217 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:35.675223 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:35.675283 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:35.703303 2064791 cri.go:92] found id: ""
	I1219 06:14:35.703317 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.703324 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:35.703329 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:35.703387 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:35.736481 2064791 cri.go:92] found id: ""
	I1219 06:14:35.736495 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.736502 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:35.736507 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:35.736571 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:35.761459 2064791 cri.go:92] found id: ""
	I1219 06:14:35.761472 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.761479 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:35.761485 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:35.761542 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:35.785228 2064791 cri.go:92] found id: ""
	I1219 06:14:35.785242 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.785249 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:35.785255 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:35.785317 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:35.811887 2064791 cri.go:92] found id: ""
	I1219 06:14:35.811901 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.811908 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:35.811913 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:35.811971 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:35.837382 2064791 cri.go:92] found id: ""
	I1219 06:14:35.837395 2064791 logs.go:282] 0 containers: []
	W1219 06:14:35.837402 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:35.837410 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:35.837420 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:35.893642 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:35.893663 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:35.911983 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:35.911999 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:35.979649 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:35.971161   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.971848   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.973453   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.974018   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:35.975638   13476 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:14:35.979659 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:35.979669 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:36.041989 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:36.042008 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:38.571113 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:38.581755 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:38.581829 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:38.606952 2064791 cri.go:92] found id: ""
	I1219 06:14:38.606977 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.606985 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:38.607000 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:38.607062 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:38.641457 2064791 cri.go:92] found id: ""
	I1219 06:14:38.641470 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.641477 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:38.641482 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:38.641544 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:38.675510 2064791 cri.go:92] found id: ""
	I1219 06:14:38.675523 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.675530 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:38.675536 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:38.675597 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:38.701888 2064791 cri.go:92] found id: ""
	I1219 06:14:38.701902 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.701909 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:38.701915 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:38.701975 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:38.728277 2064791 cri.go:92] found id: ""
	I1219 06:14:38.728290 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.728299 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:38.728305 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:38.728365 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:38.755404 2064791 cri.go:92] found id: ""
	I1219 06:14:38.755418 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.755427 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:38.755433 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:38.755495 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:38.778883 2064791 cri.go:92] found id: ""
	I1219 06:14:38.778896 2064791 logs.go:282] 0 containers: []
	W1219 06:14:38.778903 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:38.778911 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:38.778921 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:38.807023 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:38.807039 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:38.867198 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:38.867217 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:38.885283 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:38.885299 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:38.953980 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:38.945374   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.946114   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.947740   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.948287   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:38.949494   13594 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:14:38.953990 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:38.954002 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
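	The repeated cycle above (pgrep for kube-apiserver, then a `crictl ps` query per control-plane component) can be sketched as the following shell loop. This is a hedged reconstruction for readers of the report, not minikube's actual code; the component names and `crictl` flags are taken verbatim from the log lines, and `query_component` is a hypothetical helper that only prints the command being issued:

	```shell
	#!/usr/bin/env bash
	# Sketch of the per-component CRI query loop seen in the log: for each
	# control-plane component, minikube runs `crictl ps -a --quiet --name=<c>`
	# and treats empty output as "No container was found matching <c>".
	query_component() {
	  # Prints the crictl command the log shows being run for component $1.
	  printf 'sudo crictl --timeout=10s ps -a --quiet --name=%s\n' "$1"
	}

	for c in kube-apiserver etcd coredns kube-scheduler kube-proxy \
	         kube-controller-manager kindnet; do
	  query_component "$c"
	done
	```

	In the log every one of these queries returns an empty ID list, which is why each cycle ends with seven `No container was found` warnings before falling back to kubelet/containerd/dmesg logs.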
	I1219 06:14:41.516935 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:41.527938 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:41.528001 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:41.553242 2064791 cri.go:92] found id: ""
	I1219 06:14:41.553256 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.553263 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:41.553268 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:41.553333 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:41.579295 2064791 cri.go:92] found id: ""
	I1219 06:14:41.579309 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.579316 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:41.579321 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:41.579385 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:41.605144 2064791 cri.go:92] found id: ""
	I1219 06:14:41.605157 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.605164 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:41.605169 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:41.605237 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:41.629732 2064791 cri.go:92] found id: ""
	I1219 06:14:41.629747 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.629754 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:41.629760 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:41.629822 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:41.659346 2064791 cri.go:92] found id: ""
	I1219 06:14:41.659361 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.659368 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:41.659373 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:41.659432 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:41.690573 2064791 cri.go:92] found id: ""
	I1219 06:14:41.690598 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.690606 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:41.690612 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:41.690681 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:41.732984 2064791 cri.go:92] found id: ""
	I1219 06:14:41.732998 2064791 logs.go:282] 0 containers: []
	W1219 06:14:41.733006 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:41.733013 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:41.733023 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:41.795851 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:41.795871 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:41.825041 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:41.825056 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:41.886639 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:41.886659 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:41.904083 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:41.904100 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:41.971851 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:41.963475   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.964169   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.965774   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.966365   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:41.967995   13701 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:14:44.473271 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:44.483164 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:44.483222 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:44.511046 2064791 cri.go:92] found id: ""
	I1219 06:14:44.511060 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.511067 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:44.511072 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:44.511131 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:44.536197 2064791 cri.go:92] found id: ""
	I1219 06:14:44.536211 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.536219 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:44.536224 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:44.536283 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:44.562337 2064791 cri.go:92] found id: ""
	I1219 06:14:44.562354 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.562360 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:44.562366 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:44.562474 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:44.587553 2064791 cri.go:92] found id: ""
	I1219 06:14:44.587567 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.587574 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:44.587579 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:44.587637 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:44.614987 2064791 cri.go:92] found id: ""
	I1219 06:14:44.615000 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.615007 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:44.615012 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:44.615070 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:44.638714 2064791 cri.go:92] found id: ""
	I1219 06:14:44.638727 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.638734 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:44.638740 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:44.638800 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:44.688380 2064791 cri.go:92] found id: ""
	I1219 06:14:44.688393 2064791 logs.go:282] 0 containers: []
	W1219 06:14:44.688401 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:44.688409 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:44.688419 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:44.752969 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:44.752989 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:44.770407 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:44.770424 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:44.837420 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:44.829128   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.829667   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831337   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.831916   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:44.833576   13793 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1219 06:14:44.837430 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:44.837440 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:44.899538 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:44.899557 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
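	Every `kubectl` call in this section fails with `dial tcp [::1]:8441: connect: connection refused`, i.e. nothing is listening on the apiserver port at all (consistent with the empty `crictl` results above). A minimal way to confirm that failure mode, assuming you are on the minikube node and using the port 8441 shown in the log, is to probe the port with bash's `/dev/tcp` redirection:

	```shell
	#!/usr/bin/env bash
	# Probe the apiserver port from the log (8441). "connection refused" in the
	# kubectl errors means this probe would take the else branch: no listener.
	port=8441
	if (exec 3<>"/dev/tcp/localhost/${port}") 2>/dev/null; then
	  echo "port ${port}: listener present"
	  exec 3>&- 3<&-
	else
	  echo "port ${port}: no listener (connection refused)"
	fi
	```

	The probe distinguishes "apiserver up but unhealthy" from "apiserver never started"; here the log's repeated refusals point to the latter.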
	I1219 06:14:47.426650 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:47.436749 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:47.436827 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:47.461986 2064791 cri.go:92] found id: ""
	I1219 06:14:47.462000 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.462007 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:47.462012 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:47.462071 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:47.487738 2064791 cri.go:92] found id: ""
	I1219 06:14:47.487765 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.487785 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:47.487790 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:47.487934 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:47.517657 2064791 cri.go:92] found id: ""
	I1219 06:14:47.517671 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.517678 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:47.517683 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:47.517741 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:47.541725 2064791 cri.go:92] found id: ""
	I1219 06:14:47.541740 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.541747 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:47.541752 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:47.541811 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:47.566613 2064791 cri.go:92] found id: ""
	I1219 06:14:47.566627 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.566634 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:47.566640 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:47.566698 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:47.593670 2064791 cri.go:92] found id: ""
	I1219 06:14:47.593683 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.593690 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:47.593705 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:47.593778 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:47.617501 2064791 cri.go:92] found id: ""
	I1219 06:14:47.617516 2064791 logs.go:282] 0 containers: []
	W1219 06:14:47.617523 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:47.617530 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:47.617544 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:47.699175 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:47.685609   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.686090   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.691538   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.692419   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.694959   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:47.685609   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.686090   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.691538   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.692419   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:47.694959   13884 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:47.699185 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:47.699195 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:47.763955 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:47.763976 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:47.796195 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:47.796212 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:47.855457 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:47.855477 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:50.373913 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:50.384678 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:50.384743 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:50.409292 2064791 cri.go:92] found id: ""
	I1219 06:14:50.409305 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.409314 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:50.409319 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:50.409380 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:50.434622 2064791 cri.go:92] found id: ""
	I1219 06:14:50.434637 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.434644 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:50.434649 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:50.434708 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:50.462727 2064791 cri.go:92] found id: ""
	I1219 06:14:50.462741 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.462748 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:50.462754 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:50.462818 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:50.487565 2064791 cri.go:92] found id: ""
	I1219 06:14:50.487578 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.487586 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:50.487593 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:50.487655 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:50.514337 2064791 cri.go:92] found id: ""
	I1219 06:14:50.514351 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.514358 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:50.514363 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:50.514428 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:50.538808 2064791 cri.go:92] found id: ""
	I1219 06:14:50.538822 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.538829 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:50.538835 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:50.538900 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:50.562833 2064791 cri.go:92] found id: ""
	I1219 06:14:50.562847 2064791 logs.go:282] 0 containers: []
	W1219 06:14:50.562854 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:50.562862 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:50.562872 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:50.630176 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:50.621836   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.622705   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624224   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624675   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.626153   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:50.621836   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.622705   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624224   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.624675   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:50.626153   13988 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:50.630187 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:50.630197 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:50.701427 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:50.701449 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:50.729581 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:50.729602 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:50.786455 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:50.786479 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:53.304847 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:53.315504 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:53.315564 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:53.340157 2064791 cri.go:92] found id: ""
	I1219 06:14:53.340172 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.340179 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:53.340184 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:53.340242 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:53.368950 2064791 cri.go:92] found id: ""
	I1219 06:14:53.368964 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.368971 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:53.368976 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:53.369037 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:53.393336 2064791 cri.go:92] found id: ""
	I1219 06:14:53.393349 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.393356 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:53.393362 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:53.393419 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:53.417054 2064791 cri.go:92] found id: ""
	I1219 06:14:53.417069 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.417085 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:53.417091 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:53.417163 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:53.440932 2064791 cri.go:92] found id: ""
	I1219 06:14:53.440946 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.440953 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:53.440958 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:53.441016 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:53.464424 2064791 cri.go:92] found id: ""
	I1219 06:14:53.464437 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.464444 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:53.464449 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:53.464509 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:53.488126 2064791 cri.go:92] found id: ""
	I1219 06:14:53.488143 2064791 logs.go:282] 0 containers: []
	W1219 06:14:53.488150 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:53.488158 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:53.488168 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:53.558644 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:53.550747   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.551416   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553060   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553386   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.554859   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:53.550747   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.551416   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553060   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.553386   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:53.554859   14093 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:53.558655 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:53.558665 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:53.622193 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:53.622214 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:53.650744 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:53.650759 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:53.710733 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:53.710750 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:56.228553 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:56.238967 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:56.239030 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:56.262851 2064791 cri.go:92] found id: ""
	I1219 06:14:56.262864 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.262872 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:56.262877 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:56.262943 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:56.287030 2064791 cri.go:92] found id: ""
	I1219 06:14:56.287043 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.287050 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:56.287056 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:56.287118 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:56.312417 2064791 cri.go:92] found id: ""
	I1219 06:14:56.312430 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.312437 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:56.312442 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:56.312505 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:56.350599 2064791 cri.go:92] found id: ""
	I1219 06:14:56.350613 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.350622 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:56.350627 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:56.350686 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:56.374515 2064791 cri.go:92] found id: ""
	I1219 06:14:56.374528 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.374535 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:56.374540 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:56.374596 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:56.399267 2064791 cri.go:92] found id: ""
	I1219 06:14:56.399281 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.399288 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:56.399293 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:56.399351 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:56.424503 2064791 cri.go:92] found id: ""
	I1219 06:14:56.424516 2064791 logs.go:282] 0 containers: []
	W1219 06:14:56.424523 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:56.424531 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:56.424541 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:56.490954 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:56.490973 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:56.522329 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:56.522345 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:56.582279 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:56.582298 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:56.599656 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:56.599673 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:56.665092 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:56.656833   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.657522   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659110   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659433   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.661032   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:56.656833   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.657522   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659110   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.659433   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:56.661032   14220 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:14:59.165361 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:14:59.178705 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:14:59.178767 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:14:59.207415 2064791 cri.go:92] found id: ""
	I1219 06:14:59.207429 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.207436 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:14:59.207441 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:14:59.207499 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:14:59.231912 2064791 cri.go:92] found id: ""
	I1219 06:14:59.231926 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.231934 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:14:59.231939 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:14:59.232000 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:14:59.258822 2064791 cri.go:92] found id: ""
	I1219 06:14:59.258836 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.258843 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:14:59.258848 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:14:59.258909 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:14:59.283942 2064791 cri.go:92] found id: ""
	I1219 06:14:59.283955 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.283963 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:14:59.283968 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:14:59.284026 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:14:59.311236 2064791 cri.go:92] found id: ""
	I1219 06:14:59.311249 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.311256 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:14:59.311262 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:14:59.311322 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:14:59.336239 2064791 cri.go:92] found id: ""
	I1219 06:14:59.336253 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.336260 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:14:59.336267 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:14:59.336325 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:14:59.360395 2064791 cri.go:92] found id: ""
	I1219 06:14:59.360409 2064791 logs.go:282] 0 containers: []
	W1219 06:14:59.360417 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:14:59.360425 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:14:59.360435 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:14:59.423580 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:14:59.423601 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:14:59.453489 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:14:59.453506 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:14:59.512842 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:14:59.512862 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:14:59.530149 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:14:59.530168 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:14:59.593869 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:14:59.584731   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.585448   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587088   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587675   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.589312   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:14:59.584731   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.585448   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587088   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.587675   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:14:59.589312   14326 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
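	(Every retry in this block fails the same way: kubectl cannot reach the apiserver, because no kube-apiserver container is running and localhost:8441 refuses the connection. As a diagnostic aside — the `sed` pattern below is an illustration for reading such logs, not part of the minikube tooling — the refused endpoint can be lifted out of a saved stderr line like the ones above:)

```shell
#!/bin/sh
# Sketch: extract the host:port that kubectl failed to dial from one
# stderr line of this report. The pattern assumes the standard Go net
# error form "dial tcp [addr]:port: connect: connection refused".
line='dial tcp [::1]:8441: connect: connection refused'
endpoint=$(printf '%s\n' "$line" | sed -n 's/.*dial tcp \(\[[^]]*\]:[0-9]*\).*/\1/p')
echo "$endpoint"   # the endpoint the apiserver was expected on
```

	(Comparing that endpoint against the empty `crictl ps --name=kube-apiserver` results above confirms the refusal is a symptom, not the cause: the apiserver process never came up.)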
	I1219 06:15:02.094126 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:02.104778 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:02.104839 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:02.129446 2064791 cri.go:92] found id: ""
	I1219 06:15:02.129462 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.129469 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:02.129474 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:02.129539 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:02.154875 2064791 cri.go:92] found id: ""
	I1219 06:15:02.154889 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.154896 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:02.154901 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:02.155006 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:02.180628 2064791 cri.go:92] found id: ""
	I1219 06:15:02.180643 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.180650 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:02.180655 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:02.180716 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:02.205447 2064791 cri.go:92] found id: ""
	I1219 06:15:02.205462 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.205469 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:02.205475 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:02.205543 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:02.233523 2064791 cri.go:92] found id: ""
	I1219 06:15:02.233537 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.233544 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:02.233550 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:02.233610 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:02.259723 2064791 cri.go:92] found id: ""
	I1219 06:15:02.259738 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.259744 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:02.259750 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:02.259813 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:02.289093 2064791 cri.go:92] found id: ""
	I1219 06:15:02.289108 2064791 logs.go:282] 0 containers: []
	W1219 06:15:02.289115 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:02.289123 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:02.289133 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:02.347737 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:02.347758 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:02.365547 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:02.365564 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:02.433606 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:02.424090   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425124   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425822   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.427646   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.428231   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:02.424090   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425124   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.425822   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.427646   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:02.428231   14417 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:02.433616 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:02.433627 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:02.497677 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:02.497697 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:05.027685 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:05.037775 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:05.037845 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:05.062132 2064791 cri.go:92] found id: ""
	I1219 06:15:05.062146 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.062152 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:05.062157 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:05.062230 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:05.087233 2064791 cri.go:92] found id: ""
	I1219 06:15:05.087247 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.087254 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:05.087259 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:05.087318 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:05.116140 2064791 cri.go:92] found id: ""
	I1219 06:15:05.116155 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.116162 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:05.116167 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:05.116229 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:05.141158 2064791 cri.go:92] found id: ""
	I1219 06:15:05.141171 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.141179 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:05.141184 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:05.141255 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:05.166033 2064791 cri.go:92] found id: ""
	I1219 06:15:05.166046 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.166053 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:05.166059 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:05.166118 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:05.189930 2064791 cri.go:92] found id: ""
	I1219 06:15:05.189943 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.189951 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:05.189956 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:05.190013 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:05.217697 2064791 cri.go:92] found id: ""
	I1219 06:15:05.217711 2064791 logs.go:282] 0 containers: []
	W1219 06:15:05.217718 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:05.217726 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:05.217737 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:05.273609 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:05.273629 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:05.291274 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:05.291291 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:05.355137 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:05.346587   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.347461   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349261   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349738   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.351338   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:05.346587   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.347461   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349261   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.349738   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:05.351338   14523 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:05.355147 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:05.355158 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:05.418376 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:05.418395 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:07.946932 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:07.957404 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:07.957465 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:07.983257 2064791 cri.go:92] found id: ""
	I1219 06:15:07.983270 2064791 logs.go:282] 0 containers: []
	W1219 06:15:07.983277 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:07.983283 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:07.983344 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:08.010747 2064791 cri.go:92] found id: ""
	I1219 06:15:08.010762 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.010770 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:08.010776 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:08.010842 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:08.040479 2064791 cri.go:92] found id: ""
	I1219 06:15:08.040493 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.040500 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:08.040506 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:08.040566 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:08.067147 2064791 cri.go:92] found id: ""
	I1219 06:15:08.067162 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.067169 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:08.067175 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:08.067238 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:08.096399 2064791 cri.go:92] found id: ""
	I1219 06:15:08.096415 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.096422 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:08.096430 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:08.096492 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:08.120924 2064791 cri.go:92] found id: ""
	I1219 06:15:08.120938 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.120945 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:08.120951 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:08.121010 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:08.145044 2064791 cri.go:92] found id: ""
	I1219 06:15:08.145057 2064791 logs.go:282] 0 containers: []
	W1219 06:15:08.145064 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:08.145072 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:08.145082 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:08.201643 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:08.201664 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:08.219150 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:08.219166 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:08.285100 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:08.276907   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.277509   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279021   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279543   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.281080   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:08.276907   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.277509   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279021   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.279543   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:08.281080   14628 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:08.285118 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:08.285129 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:08.349440 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:08.349460 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:10.878798 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:10.888854 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:10.888917 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:10.920436 2064791 cri.go:92] found id: ""
	I1219 06:15:10.920450 2064791 logs.go:282] 0 containers: []
	W1219 06:15:10.920457 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:10.920463 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:10.920536 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:10.951229 2064791 cri.go:92] found id: ""
	I1219 06:15:10.951243 2064791 logs.go:282] 0 containers: []
	W1219 06:15:10.951252 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:10.951258 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:10.951315 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:10.980039 2064791 cri.go:92] found id: ""
	I1219 06:15:10.980054 2064791 logs.go:282] 0 containers: []
	W1219 06:15:10.980061 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:10.980066 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:10.980126 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:11.008250 2064791 cri.go:92] found id: ""
	I1219 06:15:11.008265 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.008273 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:11.008278 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:11.008346 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:11.033554 2064791 cri.go:92] found id: ""
	I1219 06:15:11.033568 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.033575 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:11.033580 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:11.033641 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:11.058115 2064791 cri.go:92] found id: ""
	I1219 06:15:11.058128 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.058135 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:11.058141 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:11.058219 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:11.083222 2064791 cri.go:92] found id: ""
	I1219 06:15:11.083236 2064791 logs.go:282] 0 containers: []
	W1219 06:15:11.083242 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:11.083250 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:11.083260 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:11.146681 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:11.146702 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:11.176028 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:11.176047 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:11.233340 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:11.233361 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:11.250941 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:11.250957 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:11.315829 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:11.306797   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.307411   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309263   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309846   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.311388   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:11.306797   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.307411   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309263   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.309846   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:11.311388   14744 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:13.816114 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:13.826460 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:13.826527 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:13.850958 2064791 cri.go:92] found id: ""
	I1219 06:15:13.850973 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.850980 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:13.850988 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:13.851048 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:13.879518 2064791 cri.go:92] found id: ""
	I1219 06:15:13.879538 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.879546 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:13.879551 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:13.879611 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:13.917876 2064791 cri.go:92] found id: ""
	I1219 06:15:13.917890 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.917897 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:13.917902 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:13.917965 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:13.957039 2064791 cri.go:92] found id: ""
	I1219 06:15:13.957053 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.957060 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:13.957065 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:13.957126 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:13.992398 2064791 cri.go:92] found id: ""
	I1219 06:15:13.992412 2064791 logs.go:282] 0 containers: []
	W1219 06:15:13.992419 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:13.992424 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:13.992486 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:14.019915 2064791 cri.go:92] found id: ""
	I1219 06:15:14.019930 2064791 logs.go:282] 0 containers: []
	W1219 06:15:14.019938 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:14.019943 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:14.020004 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:14.045800 2064791 cri.go:92] found id: ""
	I1219 06:15:14.045815 2064791 logs.go:282] 0 containers: []
	W1219 06:15:14.045822 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:14.045830 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:14.045841 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:14.102453 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:14.102472 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:14.120093 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:14.120110 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:14.183187 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:14.175289   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.175797   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177338   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177777   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.179247   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:14.175289   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.175797   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177338   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.177777   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:14.179247   14833 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:14.183198 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:14.183209 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:14.246652 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:14.246673 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:16.780257 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:16.790741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:16.790802 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:16.815777 2064791 cri.go:92] found id: ""
	I1219 06:15:16.815802 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.815809 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:16.815815 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:16.815890 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:16.841105 2064791 cri.go:92] found id: ""
	I1219 06:15:16.841124 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.841142 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:16.841148 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:16.841217 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:16.866795 2064791 cri.go:92] found id: ""
	I1219 06:15:16.866820 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.866827 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:16.866833 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:16.866910 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:16.892692 2064791 cri.go:92] found id: ""
	I1219 06:15:16.892706 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.892713 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:16.892718 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:16.892803 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:16.926258 2064791 cri.go:92] found id: ""
	I1219 06:15:16.926272 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.926279 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:16.926285 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:16.926346 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:16.955968 2064791 cri.go:92] found id: ""
	I1219 06:15:16.955982 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.955989 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:16.955995 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:16.956057 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:16.985158 2064791 cri.go:92] found id: ""
	I1219 06:15:16.985172 2064791 logs.go:282] 0 containers: []
	W1219 06:15:16.985179 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:16.985186 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:16.985196 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:17.043879 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:17.043899 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:17.061599 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:17.061616 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:17.125509 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:17.117153   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.117733   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119480   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119896   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.121399   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:17.117153   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.117733   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119480   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.119896   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:17.121399   14937 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:17.125519 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:17.125531 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:17.189339 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:17.189359 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:19.721517 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:19.731846 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:19.731916 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:19.758133 2064791 cri.go:92] found id: ""
	I1219 06:15:19.758147 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.758154 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:19.758160 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:19.758228 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:19.787023 2064791 cri.go:92] found id: ""
	I1219 06:15:19.787037 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.787045 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:19.787059 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:19.787123 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:19.813855 2064791 cri.go:92] found id: ""
	I1219 06:15:19.813869 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.813876 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:19.813881 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:19.813944 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:19.838418 2064791 cri.go:92] found id: ""
	I1219 06:15:19.838432 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.838439 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:19.838444 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:19.838508 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:19.863215 2064791 cri.go:92] found id: ""
	I1219 06:15:19.863229 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.863240 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:19.863246 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:19.863307 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:19.887732 2064791 cri.go:92] found id: ""
	I1219 06:15:19.887746 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.887753 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:19.887758 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:19.887815 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:19.930174 2064791 cri.go:92] found id: ""
	I1219 06:15:19.930192 2064791 logs.go:282] 0 containers: []
	W1219 06:15:19.930200 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:19.930208 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:19.930222 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:19.949025 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:19.949041 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:20.022932 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:20.013526   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.014350   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016121   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016702   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.018424   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:20.013526   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.014350   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016121   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.016702   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:20.018424   15040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:20.022944 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:20.022955 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:20.088903 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:20.088924 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:20.117778 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:20.117794 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:22.677536 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:22.687468 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:22.687536 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:22.712714 2064791 cri.go:92] found id: ""
	I1219 06:15:22.712728 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.712736 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:22.712741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:22.712816 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:22.736316 2064791 cri.go:92] found id: ""
	I1219 06:15:22.736329 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.736336 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:22.736341 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:22.736401 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:22.762215 2064791 cri.go:92] found id: ""
	I1219 06:15:22.762229 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.762236 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:22.762241 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:22.762309 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:22.787061 2064791 cri.go:92] found id: ""
	I1219 06:15:22.787074 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.787081 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:22.787086 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:22.787146 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:22.814937 2064791 cri.go:92] found id: ""
	I1219 06:15:22.814951 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.814957 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:22.814963 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:22.815033 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:22.842839 2064791 cri.go:92] found id: ""
	I1219 06:15:22.842853 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.842859 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:22.842865 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:22.842923 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:22.869394 2064791 cri.go:92] found id: ""
	I1219 06:15:22.869407 2064791 logs.go:282] 0 containers: []
	W1219 06:15:22.869413 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:22.869421 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:22.869430 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:22.926492 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:22.926510 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:22.944210 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:22.944232 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:23.013797 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:23.003493   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.005070   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.006087   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.007846   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.008447   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:23.003493   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.005070   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.006087   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.007846   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:23.008447   15149 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:23.013807 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:23.013821 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:23.081279 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:23.081306 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:25.612946 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:25.622887 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:25.622947 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:25.656332 2064791 cri.go:92] found id: ""
	I1219 06:15:25.656346 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.656353 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:25.656359 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:25.656425 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:25.680887 2064791 cri.go:92] found id: ""
	I1219 06:15:25.680901 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.680908 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:25.680913 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:25.680981 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:25.705508 2064791 cri.go:92] found id: ""
	I1219 06:15:25.705523 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.705531 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:25.705536 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:25.705598 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:25.729434 2064791 cri.go:92] found id: ""
	I1219 06:15:25.729447 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.729454 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:25.729459 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:25.729517 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:25.755351 2064791 cri.go:92] found id: ""
	I1219 06:15:25.755365 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.755381 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:25.755388 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:25.755449 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:25.782840 2064791 cri.go:92] found id: ""
	I1219 06:15:25.782854 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.782861 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:25.782866 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:25.782929 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:25.811125 2064791 cri.go:92] found id: ""
	I1219 06:15:25.811139 2064791 logs.go:282] 0 containers: []
	W1219 06:15:25.811155 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:25.811165 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:25.811175 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:25.867579 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:25.867601 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:25.884977 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:25.884996 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:25.983099 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:25.974919   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.975374   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.977076   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.977555   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.979165   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:25.974919   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.975374   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.977076   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.977555   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:25.979165   15250 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:25.983110 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:25.983119 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:26.047515 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:26.047534 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:28.576468 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:28.586983 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:28.587044 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:28.612243 2064791 cri.go:92] found id: ""
	I1219 06:15:28.612257 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.612264 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:28.612270 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:28.612331 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:28.637476 2064791 cri.go:92] found id: ""
	I1219 06:15:28.637490 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.637496 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:28.637502 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:28.637564 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:28.662778 2064791 cri.go:92] found id: ""
	I1219 06:15:28.662792 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.662800 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:28.662805 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:28.662864 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:28.687078 2064791 cri.go:92] found id: ""
	I1219 06:15:28.687091 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.687098 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:28.687105 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:28.687166 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:28.712552 2064791 cri.go:92] found id: ""
	I1219 06:15:28.712566 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.712572 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:28.712577 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:28.712646 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:28.738798 2064791 cri.go:92] found id: ""
	I1219 06:15:28.738812 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.738819 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:28.738824 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:28.738881 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:28.767309 2064791 cri.go:92] found id: ""
	I1219 06:15:28.767324 2064791 logs.go:282] 0 containers: []
	W1219 06:15:28.767340 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:28.767349 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:28.767358 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:28.827489 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:28.827509 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:28.844978 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:28.844994 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:28.915425 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:28.906948   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.907778   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.909411   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.909881   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.911514   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:28.906948   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.907778   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.909411   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.909881   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:28.911514   15355 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:28.915435 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:28.915445 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:28.980721 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:28.980742 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:31.518692 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:31.528660 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:31.528719 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:31.551685 2064791 cri.go:92] found id: ""
	I1219 06:15:31.551699 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.551706 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:31.551711 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:31.551772 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:31.578616 2064791 cri.go:92] found id: ""
	I1219 06:15:31.578631 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.578637 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:31.578643 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:31.578703 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:31.602562 2064791 cri.go:92] found id: ""
	I1219 06:15:31.602576 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.602582 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:31.602588 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:31.602646 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:31.626697 2064791 cri.go:92] found id: ""
	I1219 06:15:31.626711 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.626718 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:31.626723 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:31.626786 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:31.650705 2064791 cri.go:92] found id: ""
	I1219 06:15:31.650718 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.650725 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:31.650730 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:31.650791 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:31.675292 2064791 cri.go:92] found id: ""
	I1219 06:15:31.675305 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.675312 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:31.675318 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:31.675390 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:31.699969 2064791 cri.go:92] found id: ""
	I1219 06:15:31.699993 2064791 logs.go:282] 0 containers: []
	W1219 06:15:31.700000 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:31.700008 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:31.700018 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:31.765728 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:31.765750 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:31.793450 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:31.793466 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:31.849244 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:31.849262 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:31.866467 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:31.866483 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:31.960156 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:31.933314   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.948921   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.949480   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.951697   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.951968   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:31.933314   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.948921   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.949480   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.951697   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:31.951968   15472 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:34.460923 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:34.473072 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:34.473134 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:34.498075 2064791 cri.go:92] found id: ""
	I1219 06:15:34.498089 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.498097 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:34.498103 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:34.498162 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:34.522785 2064791 cri.go:92] found id: ""
	I1219 06:15:34.522800 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.522807 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:34.522812 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:34.522871 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:34.550566 2064791 cri.go:92] found id: ""
	I1219 06:15:34.550580 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.550587 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:34.550592 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:34.550651 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:34.579586 2064791 cri.go:92] found id: ""
	I1219 06:15:34.579600 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.579607 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:34.579612 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:34.579670 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:34.606248 2064791 cri.go:92] found id: ""
	I1219 06:15:34.606261 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.606269 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:34.606274 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:34.606335 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:34.634419 2064791 cri.go:92] found id: ""
	I1219 06:15:34.634433 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.634440 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:34.634446 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:34.634509 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:34.658438 2064791 cri.go:92] found id: ""
	I1219 06:15:34.658451 2064791 logs.go:282] 0 containers: []
	W1219 06:15:34.658458 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:34.658465 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:34.658475 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:34.675933 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:34.675950 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:34.740273 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:34.732883   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.733297   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.734737   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.735051   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.736480   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:34.732883   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.733297   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.734737   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.735051   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:34.736480   15564 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:34.740283 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:34.740293 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:34.802357 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:34.802378 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:34.833735 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:34.833751 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:37.390170 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:37.400300 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:37.400358 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:37.425095 2064791 cri.go:92] found id: ""
	I1219 06:15:37.425110 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.425117 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:37.425122 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:37.425178 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:37.451178 2064791 cri.go:92] found id: ""
	I1219 06:15:37.451192 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.451199 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:37.451205 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:37.451273 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:37.475828 2064791 cri.go:92] found id: ""
	I1219 06:15:37.475842 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.475848 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:37.475854 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:37.475911 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:37.499474 2064791 cri.go:92] found id: ""
	I1219 06:15:37.499488 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.499494 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:37.499500 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:37.499563 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:37.523636 2064791 cri.go:92] found id: ""
	I1219 06:15:37.523649 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.523656 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:37.523662 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:37.523720 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:37.547846 2064791 cri.go:92] found id: ""
	I1219 06:15:37.547859 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.547868 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:37.547873 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:37.547929 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:37.574766 2064791 cri.go:92] found id: ""
	I1219 06:15:37.574780 2064791 logs.go:282] 0 containers: []
	W1219 06:15:37.574787 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:37.574795 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:37.574805 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:37.601905 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:37.601923 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:37.657564 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:37.657584 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:37.674777 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:37.674793 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:37.736918 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:37.728853   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.729594   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731160   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731519   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.733094   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:37.728853   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.729594   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731160   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.731519   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:37.733094   15679 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:37.736928 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:37.736939 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:40.303769 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:40.313854 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:40.313919 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:40.338505 2064791 cri.go:92] found id: ""
	I1219 06:15:40.338519 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.338527 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:40.338532 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:40.338594 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:40.363391 2064791 cri.go:92] found id: ""
	I1219 06:15:40.363405 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.363412 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:40.363417 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:40.363476 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:40.389092 2064791 cri.go:92] found id: ""
	I1219 06:15:40.389105 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.389113 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:40.389118 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:40.389184 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:40.412993 2064791 cri.go:92] found id: ""
	I1219 06:15:40.413007 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.413014 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:40.413022 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:40.413087 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:40.438530 2064791 cri.go:92] found id: ""
	I1219 06:15:40.438544 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.438550 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:40.438556 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:40.438617 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:40.462221 2064791 cri.go:92] found id: ""
	I1219 06:15:40.462235 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.462242 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:40.462248 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:40.462310 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:40.487125 2064791 cri.go:92] found id: ""
	I1219 06:15:40.487139 2064791 logs.go:282] 0 containers: []
	W1219 06:15:40.487146 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:40.487155 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:40.487165 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:40.543163 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:40.543184 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:40.560362 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:40.560379 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:40.627130 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:40.619309   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.619937   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621423   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621900   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.623348   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:40.619309   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.619937   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621423   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.621900   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:40.623348   15773 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:40.627139 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:40.627149 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:40.689654 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:40.689673 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:43.219338 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:43.229544 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:43.229607 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:43.253914 2064791 cri.go:92] found id: ""
	I1219 06:15:43.253935 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.253941 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:43.253947 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:43.254007 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:43.279019 2064791 cri.go:92] found id: ""
	I1219 06:15:43.279033 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.279040 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:43.279045 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:43.279106 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:43.304187 2064791 cri.go:92] found id: ""
	I1219 06:15:43.304202 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.304209 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:43.304216 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:43.304275 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:43.327938 2064791 cri.go:92] found id: ""
	I1219 06:15:43.327951 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.327958 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:43.327963 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:43.328027 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:43.356864 2064791 cri.go:92] found id: ""
	I1219 06:15:43.356878 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.356885 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:43.356891 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:43.356958 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:43.381050 2064791 cri.go:92] found id: ""
	I1219 06:15:43.381063 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.381070 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:43.381076 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:43.381138 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:43.404804 2064791 cri.go:92] found id: ""
	I1219 06:15:43.404818 2064791 logs.go:282] 0 containers: []
	W1219 06:15:43.404825 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:43.404832 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:43.404857 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:43.470026 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:43.461361   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.461922   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.463417   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.464514   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.465204   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:43.461361   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.461922   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.463417   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.464514   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:43.465204   15871 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:43.470036 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:43.470050 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:43.533067 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:43.533086 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:43.560074 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:43.560097 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:43.618564 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:43.618582 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:46.135866 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:46.146429 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:46.146493 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:46.180564 2064791 cri.go:92] found id: ""
	I1219 06:15:46.180578 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.180595 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:46.180601 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:46.180669 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:46.208067 2064791 cri.go:92] found id: ""
	I1219 06:15:46.208081 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.208087 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:46.208100 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:46.208159 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:46.234676 2064791 cri.go:92] found id: ""
	I1219 06:15:46.234692 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.234703 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:46.234709 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:46.234775 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:46.259673 2064791 cri.go:92] found id: ""
	I1219 06:15:46.259686 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.259693 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:46.259707 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:46.259765 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:46.286964 2064791 cri.go:92] found id: ""
	I1219 06:15:46.286979 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.286986 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:46.286992 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:46.287056 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:46.312785 2064791 cri.go:92] found id: ""
	I1219 06:15:46.312800 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.312807 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:46.312813 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:46.312875 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:46.339250 2064791 cri.go:92] found id: ""
	I1219 06:15:46.339264 2064791 logs.go:282] 0 containers: []
	W1219 06:15:46.339271 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:46.339279 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:46.339290 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:46.368113 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:46.368129 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:46.423008 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:46.423029 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:46.440481 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:46.440503 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:46.504270 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:46.496670   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.497191   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.498736   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.499181   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.500593   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:46.496670   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.497191   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.498736   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.499181   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:46.500593   15994 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:46.504280 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:46.504291 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:49.065736 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:49.075993 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:49.076057 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:49.102714 2064791 cri.go:92] found id: ""
	I1219 06:15:49.102729 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.102736 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:49.102741 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:49.102808 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:49.131284 2064791 cri.go:92] found id: ""
	I1219 06:15:49.131297 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.131323 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:49.131328 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:49.131398 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:49.166942 2064791 cri.go:92] found id: ""
	I1219 06:15:49.166955 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.166962 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:49.166968 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:49.167036 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:49.204412 2064791 cri.go:92] found id: ""
	I1219 06:15:49.204425 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.204444 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:49.204450 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:49.204522 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:49.232351 2064791 cri.go:92] found id: ""
	I1219 06:15:49.232364 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.232371 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:49.232377 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:49.232434 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:49.257013 2064791 cri.go:92] found id: ""
	I1219 06:15:49.257028 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.257046 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:49.257052 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:49.257112 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:49.282354 2064791 cri.go:92] found id: ""
	I1219 06:15:49.282368 2064791 logs.go:282] 0 containers: []
	W1219 06:15:49.282375 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:49.282384 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:49.282396 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:49.351742 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:49.342325   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.343272   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345051   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345596   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.347231   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:49.342325   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.343272   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345051   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.345596   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:49.347231   16079 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:49.351753 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:49.351764 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:49.416971 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:49.416991 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:49.445804 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:49.445819 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:49.503988 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:49.504006 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:52.023309 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:52.034750 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:52.034819 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:52.061000 2064791 cri.go:92] found id: ""
	I1219 06:15:52.061014 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.061021 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:52.061026 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:52.061084 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:52.086949 2064791 cri.go:92] found id: ""
	I1219 06:15:52.086964 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.086971 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:52.086977 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:52.087048 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:52.112534 2064791 cri.go:92] found id: ""
	I1219 06:15:52.112549 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.112556 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:52.112562 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:52.112635 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:52.137132 2064791 cri.go:92] found id: ""
	I1219 06:15:52.137146 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.137154 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:52.137160 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:52.137221 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:52.191157 2064791 cri.go:92] found id: ""
	I1219 06:15:52.191171 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.191178 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:52.191184 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:52.191245 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:52.220921 2064791 cri.go:92] found id: ""
	I1219 06:15:52.220936 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.220942 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:52.220948 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:52.221009 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:52.250645 2064791 cri.go:92] found id: ""
	I1219 06:15:52.250658 2064791 logs.go:282] 0 containers: []
	W1219 06:15:52.250665 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:52.250673 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:52.250684 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:52.306199 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:52.306222 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:52.323553 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:52.323570 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:52.386634 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:52.378311   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.379023   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.380829   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.381330   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.382864   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:52.378311   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.379023   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.380829   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.381330   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:52.382864   16188 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:52.386643 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:52.386653 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:52.450135 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:52.450155 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:54.981347 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:54.991806 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:54.991864 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:55.028687 2064791 cri.go:92] found id: ""
	I1219 06:15:55.028702 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.028709 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:55.028714 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:55.028797 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:55.053716 2064791 cri.go:92] found id: ""
	I1219 06:15:55.053730 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.053737 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:55.053784 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:55.053857 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:55.080935 2064791 cri.go:92] found id: ""
	I1219 06:15:55.080949 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.080957 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:55.080962 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:55.081027 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:55.109910 2064791 cri.go:92] found id: ""
	I1219 06:15:55.109925 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.109932 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:55.109938 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:55.110005 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:55.138372 2064791 cri.go:92] found id: ""
	I1219 06:15:55.138386 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.138393 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:55.138400 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:55.138463 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:55.172107 2064791 cri.go:92] found id: ""
	I1219 06:15:55.172121 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.172128 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:55.172133 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:55.172191 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:55.207670 2064791 cri.go:92] found id: ""
	I1219 06:15:55.207684 2064791 logs.go:282] 0 containers: []
	W1219 06:15:55.207690 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:55.207698 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:55.207708 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:15:55.273955 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:55.273975 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:55.303942 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:55.303960 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:55.367492 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:55.367517 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:55.384909 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:55.384933 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:55.447954 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:55.439722   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.440407   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442003   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442525   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.444025   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:55.439722   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.440407   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442003   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.442525   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:55.444025   16305 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:57.948746 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:15:57.959024 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:15:57.959084 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:15:57.984251 2064791 cri.go:92] found id: ""
	I1219 06:15:57.984264 2064791 logs.go:282] 0 containers: []
	W1219 06:15:57.984271 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:15:57.984277 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:15:57.984335 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:15:58.012444 2064791 cri.go:92] found id: ""
	I1219 06:15:58.012459 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.012467 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:15:58.012472 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:15:58.012531 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:15:58.040674 2064791 cri.go:92] found id: ""
	I1219 06:15:58.040688 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.040695 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:15:58.040700 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:15:58.040783 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:15:58.066507 2064791 cri.go:92] found id: ""
	I1219 06:15:58.066522 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.066529 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:15:58.066535 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:15:58.066598 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:15:58.095594 2064791 cri.go:92] found id: ""
	I1219 06:15:58.095608 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.095615 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:15:58.095620 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:15:58.095680 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:15:58.121624 2064791 cri.go:92] found id: ""
	I1219 06:15:58.121638 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.121644 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:15:58.121650 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:15:58.121707 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:15:58.149741 2064791 cri.go:92] found id: ""
	I1219 06:15:58.149755 2064791 logs.go:282] 0 containers: []
	W1219 06:15:58.149762 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:15:58.149770 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:15:58.149782 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:15:58.181272 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:15:58.181288 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:15:58.240957 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:15:58.240987 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:15:58.258044 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:15:58.258060 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:15:58.322228 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:15:58.314484   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.315163   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.316700   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.317166   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.318619   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:15:58.314484   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.315163   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.316700   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.317166   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:15:58.318619   16407 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:15:58.322239 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:15:58.322250 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:00.885057 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:00.895320 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:00.895386 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:00.919880 2064791 cri.go:92] found id: ""
	I1219 06:16:00.919914 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.919922 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:00.919927 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:00.919995 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:00.944225 2064791 cri.go:92] found id: ""
	I1219 06:16:00.944238 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.944245 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:00.944250 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:00.944316 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:00.969895 2064791 cri.go:92] found id: ""
	I1219 06:16:00.969909 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.969916 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:00.969921 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:00.969982 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:00.994103 2064791 cri.go:92] found id: ""
	I1219 06:16:00.994118 2064791 logs.go:282] 0 containers: []
	W1219 06:16:00.994134 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:00.994141 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:00.994224 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:01.021151 2064791 cri.go:92] found id: ""
	I1219 06:16:01.021166 2064791 logs.go:282] 0 containers: []
	W1219 06:16:01.021172 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:01.021181 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:01.021244 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:01.046747 2064791 cri.go:92] found id: ""
	I1219 06:16:01.046761 2064791 logs.go:282] 0 containers: []
	W1219 06:16:01.046768 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:01.046773 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:01.046831 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:01.071655 2064791 cri.go:92] found id: ""
	I1219 06:16:01.071672 2064791 logs.go:282] 0 containers: []
	W1219 06:16:01.071679 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:01.071686 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:01.071696 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:01.127618 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:01.127636 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:01.145631 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:01.145650 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:01.235681 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:01.226719   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.227442   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229160   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229807   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.231513   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:01.226719   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.227442   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229160   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.229807   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:01.231513   16501 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:01.235691 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:01.235703 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:01.299234 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:01.299254 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:03.829050 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:03.839364 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:03.839436 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:03.871023 2064791 cri.go:92] found id: ""
	I1219 06:16:03.871037 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.871044 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:03.871049 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:03.871107 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:03.895774 2064791 cri.go:92] found id: ""
	I1219 06:16:03.895788 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.895795 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:03.895800 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:03.895859 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:03.921890 2064791 cri.go:92] found id: ""
	I1219 06:16:03.921904 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.921911 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:03.921916 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:03.921978 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:03.946705 2064791 cri.go:92] found id: ""
	I1219 06:16:03.946719 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.946726 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:03.946731 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:03.946790 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:03.972566 2064791 cri.go:92] found id: ""
	I1219 06:16:03.972579 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.972605 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:03.972610 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:03.972676 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:03.998217 2064791 cri.go:92] found id: ""
	I1219 06:16:03.998232 2064791 logs.go:282] 0 containers: []
	W1219 06:16:03.998239 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:03.998245 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:03.998311 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:04.024748 2064791 cri.go:92] found id: ""
	I1219 06:16:04.024786 2064791 logs.go:282] 0 containers: []
	W1219 06:16:04.024793 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:04.024802 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:04.024827 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:04.089385 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:04.089406 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:04.120677 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:04.120695 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:04.178263 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:04.178282 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:04.201672 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:04.201688 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:04.272543 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:04.263798   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.264930   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.265587   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267210   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267480   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:04.263798   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.264930   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.265587   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267210   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:04.267480   16619 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:06.772819 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:06.784042 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:06.784119 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:06.809087 2064791 cri.go:92] found id: ""
	I1219 06:16:06.809101 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.809108 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:06.809113 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:06.809171 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:06.833636 2064791 cri.go:92] found id: ""
	I1219 06:16:06.833649 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.833656 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:06.833661 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:06.833726 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:06.862766 2064791 cri.go:92] found id: ""
	I1219 06:16:06.862781 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.862788 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:06.862797 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:06.862858 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:06.887915 2064791 cri.go:92] found id: ""
	I1219 06:16:06.887929 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.887935 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:06.887940 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:06.888001 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:06.913093 2064791 cri.go:92] found id: ""
	I1219 06:16:06.913107 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.913114 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:06.913119 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:06.913184 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:06.944662 2064791 cri.go:92] found id: ""
	I1219 06:16:06.944677 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.944695 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:06.944700 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:06.944796 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:06.976908 2064791 cri.go:92] found id: ""
	I1219 06:16:06.976923 2064791 logs.go:282] 0 containers: []
	W1219 06:16:06.976929 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:06.976937 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:06.976948 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:07.041844 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:07.041865 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:07.071749 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:07.071765 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:07.130039 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:07.130060 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:07.147749 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:07.147766 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:07.226540 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:07.218267   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.218857   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220373   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220937   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.222460   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:07.218267   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.218857   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220373   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.220937   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:07.222460   16726 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:09.726802 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:09.737347 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:09.737408 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:09.761740 2064791 cri.go:92] found id: ""
	I1219 06:16:09.761754 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.761761 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:09.761767 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:09.761838 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:09.787861 2064791 cri.go:92] found id: ""
	I1219 06:16:09.787876 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.787883 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:09.787888 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:09.787950 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:09.812599 2064791 cri.go:92] found id: ""
	I1219 06:16:09.812613 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.812620 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:09.812625 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:09.812687 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:09.837573 2064791 cri.go:92] found id: ""
	I1219 06:16:09.837588 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.837596 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:09.837601 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:09.837661 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:09.861697 2064791 cri.go:92] found id: ""
	I1219 06:16:09.861712 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.861718 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:09.861723 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:09.861788 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:09.886842 2064791 cri.go:92] found id: ""
	I1219 06:16:09.886856 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.886872 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:09.886884 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:09.886956 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:09.912372 2064791 cri.go:92] found id: ""
	I1219 06:16:09.912387 2064791 logs.go:282] 0 containers: []
	W1219 06:16:09.912395 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:09.912403 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:09.912413 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:09.971481 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:09.971501 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:09.989303 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:09.989320 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:10.067493 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:10.058017   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059169   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059962   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.061845   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.062203   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:10.058017   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059169   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.059962   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.061845   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:10.062203   16811 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:10.067504 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:10.067517 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:10.132042 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:10.132062 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:12.664804 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:12.675466 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:12.675550 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:12.704963 2064791 cri.go:92] found id: ""
	I1219 06:16:12.704978 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.704985 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:12.704990 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:12.705052 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:12.730087 2064791 cri.go:92] found id: ""
	I1219 06:16:12.730103 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.730110 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:12.730115 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:12.730178 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:12.760566 2064791 cri.go:92] found id: ""
	I1219 06:16:12.760595 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.760602 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:12.760608 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:12.760675 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:12.785694 2064791 cri.go:92] found id: ""
	I1219 06:16:12.785707 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.785714 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:12.785719 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:12.785781 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:12.811923 2064791 cri.go:92] found id: ""
	I1219 06:16:12.811938 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.811956 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:12.811962 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:12.812036 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:12.838424 2064791 cri.go:92] found id: ""
	I1219 06:16:12.838438 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.838445 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:12.838451 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:12.838514 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:12.864177 2064791 cri.go:92] found id: ""
	I1219 06:16:12.864191 2064791 logs.go:282] 0 containers: []
	W1219 06:16:12.864198 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:12.864206 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:12.864216 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:12.920882 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:12.920904 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:12.937942 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:12.937959 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:13.004209 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:12.994302   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.994966   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.996691   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.997250   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.998931   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:12.994302   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.994966   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.996691   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.997250   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:12.998931   16911 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:13.004223 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:13.004247 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:13.067051 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:13.067071 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:15.596451 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:15.606953 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:15.607013 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:15.639546 2064791 cri.go:92] found id: ""
	I1219 06:16:15.639560 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.639569 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:15.639574 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:15.639637 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:15.667230 2064791 cri.go:92] found id: ""
	I1219 06:16:15.667245 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.667252 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:15.667257 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:15.667321 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:15.693059 2064791 cri.go:92] found id: ""
	I1219 06:16:15.693073 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.693080 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:15.693086 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:15.693145 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:15.718341 2064791 cri.go:92] found id: ""
	I1219 06:16:15.718356 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.718363 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:15.718368 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:15.718437 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:15.744544 2064791 cri.go:92] found id: ""
	I1219 06:16:15.744559 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.744566 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:15.744571 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:15.744632 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:15.769809 2064791 cri.go:92] found id: ""
	I1219 06:16:15.769823 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.769830 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:15.769836 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:15.769897 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:15.793872 2064791 cri.go:92] found id: ""
	I1219 06:16:15.793887 2064791 logs.go:282] 0 containers: []
	W1219 06:16:15.793894 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:15.793902 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:15.793914 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:15.811209 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:15.811228 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:15.875495 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:15.867475   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.868031   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.869581   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.870044   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.871521   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:15.867475   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.868031   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.869581   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.870044   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:15.871521   17013 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:15.875504 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:15.875516 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:15.938869 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:15.938889 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:15.967183 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:15.967200 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:18.524056 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:18.534213 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:16:18.534283 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:16:18.558904 2064791 cri.go:92] found id: ""
	I1219 06:16:18.558918 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.558924 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:16:18.558929 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:16:18.558994 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:16:18.583638 2064791 cri.go:92] found id: ""
	I1219 06:16:18.583653 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.583661 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:16:18.583666 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:16:18.583726 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:16:18.611047 2064791 cri.go:92] found id: ""
	I1219 06:16:18.611061 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.611068 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:16:18.611073 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:16:18.611133 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:16:18.635234 2064791 cri.go:92] found id: ""
	I1219 06:16:18.635248 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.635255 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:16:18.635261 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:16:18.635322 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:16:18.658732 2064791 cri.go:92] found id: ""
	I1219 06:16:18.658747 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.658754 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:16:18.658759 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:16:18.658819 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:16:18.687782 2064791 cri.go:92] found id: ""
	I1219 06:16:18.687796 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.687803 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:16:18.687808 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:16:18.687871 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:16:18.713641 2064791 cri.go:92] found id: ""
	I1219 06:16:18.713655 2064791 logs.go:282] 0 containers: []
	W1219 06:16:18.713662 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:16:18.713670 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:16:18.713687 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:16:18.730768 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:16:18.730786 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:16:18.797385 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:16:18.788999   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.789629   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791299   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791871   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.793492   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:16:18.788999   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.789629   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791299   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.791871   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:16:18.793492   17117 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:16:18.797396 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:16:18.797406 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:16:18.861009 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:16:18.861029 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:16:18.889085 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:16:18.889102 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:16:21.448880 2064791 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:16:21.458996 2064791 kubeadm.go:602] duration metric: took 4m4.592886052s to restartPrimaryControlPlane
	W1219 06:16:21.459078 2064791 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1219 06:16:21.459152 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1219 06:16:21.873036 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:16:21.887075 2064791 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1219 06:16:21.894868 2064791 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1219 06:16:21.894925 2064791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:16:21.902909 2064791 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1219 06:16:21.902919 2064791 kubeadm.go:158] found existing configuration files:
	
	I1219 06:16:21.902973 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 06:16:21.912282 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1219 06:16:21.912342 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1219 06:16:21.920310 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 06:16:21.928090 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1219 06:16:21.928158 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:16:21.935829 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 06:16:21.944085 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1219 06:16:21.944143 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:16:21.951866 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 06:16:21.959883 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1219 06:16:21.959950 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:16:21.967628 2064791 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1219 06:16:22.006002 2064791 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1219 06:16:22.006076 2064791 kubeadm.go:319] [preflight] Running pre-flight checks
	I1219 06:16:22.084826 2064791 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1219 06:16:22.084890 2064791 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1219 06:16:22.084925 2064791 kubeadm.go:319] OS: Linux
	I1219 06:16:22.084969 2064791 kubeadm.go:319] CGROUPS_CPU: enabled
	I1219 06:16:22.085017 2064791 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1219 06:16:22.085068 2064791 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1219 06:16:22.085115 2064791 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1219 06:16:22.085163 2064791 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1219 06:16:22.085209 2064791 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1219 06:16:22.085254 2064791 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1219 06:16:22.085302 2064791 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1219 06:16:22.085348 2064791 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1219 06:16:22.154531 2064791 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1219 06:16:22.154670 2064791 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1219 06:16:22.154781 2064791 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1219 06:16:22.163477 2064791 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1219 06:16:22.169007 2064791 out.go:252]   - Generating certificates and keys ...
	I1219 06:16:22.169099 2064791 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1219 06:16:22.169162 2064791 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1219 06:16:22.169237 2064791 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1219 06:16:22.169297 2064791 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1219 06:16:22.169372 2064791 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1219 06:16:22.169426 2064791 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1219 06:16:22.169488 2064791 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1219 06:16:22.169549 2064791 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1219 06:16:22.169633 2064791 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1219 06:16:22.169704 2064791 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1219 06:16:22.169741 2064791 kubeadm.go:319] [certs] Using the existing "sa" key
	I1219 06:16:22.169795 2064791 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1219 06:16:22.320644 2064791 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1219 06:16:22.743805 2064791 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1219 06:16:22.867878 2064791 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1219 06:16:22.974729 2064791 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1219 06:16:23.395365 2064791 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1219 06:16:23.396030 2064791 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1219 06:16:23.399355 2064791 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1219 06:16:23.402464 2064791 out.go:252]   - Booting up control plane ...
	I1219 06:16:23.402561 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1219 06:16:23.402637 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1219 06:16:23.403521 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1219 06:16:23.423590 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1219 06:16:23.423990 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1219 06:16:23.431661 2064791 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1219 06:16:23.431897 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1219 06:16:23.432074 2064791 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1219 06:16:23.567443 2064791 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1219 06:16:23.567557 2064791 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1219 06:20:23.567966 2064791 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000561406s
	I1219 06:20:23.567991 2064791 kubeadm.go:319] 
	I1219 06:20:23.568084 2064791 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1219 06:20:23.568128 2064791 kubeadm.go:319] 	- The kubelet is not running
	I1219 06:20:23.568239 2064791 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1219 06:20:23.568244 2064791 kubeadm.go:319] 
	I1219 06:20:23.568354 2064791 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1219 06:20:23.568390 2064791 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1219 06:20:23.568420 2064791 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1219 06:20:23.568423 2064791 kubeadm.go:319] 
	I1219 06:20:23.572732 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 06:20:23.573205 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1219 06:20:23.573348 2064791 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 06:20:23.573651 2064791 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1219 06:20:23.573656 2064791 kubeadm.go:319] 
	W1219 06:20:23.573846 2064791 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000561406s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1219 06:20:23.573948 2064791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1219 06:20:23.574218 2064791 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1219 06:20:23.984042 2064791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:20:23.997740 2064791 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1219 06:20:23.997798 2064791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:20:24.008638 2064791 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1219 06:20:24.008649 2064791 kubeadm.go:158] found existing configuration files:
	
	I1219 06:20:24.008724 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1219 06:20:24.018051 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1219 06:20:24.018112 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1219 06:20:24.026089 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1219 06:20:24.034468 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1219 06:20:24.034524 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:20:24.042330 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1219 06:20:24.050325 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1219 06:20:24.050390 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:20:24.058263 2064791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1219 06:20:24.066872 2064791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1219 06:20:24.066933 2064791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:20:24.075206 2064791 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1219 06:20:24.113532 2064791 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1219 06:20:24.113595 2064791 kubeadm.go:319] [preflight] Running pre-flight checks
	I1219 06:20:24.190273 2064791 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1219 06:20:24.190347 2064791 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1219 06:20:24.190399 2064791 kubeadm.go:319] OS: Linux
	I1219 06:20:24.190447 2064791 kubeadm.go:319] CGROUPS_CPU: enabled
	I1219 06:20:24.190497 2064791 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1219 06:20:24.190547 2064791 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1219 06:20:24.190597 2064791 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1219 06:20:24.190648 2064791 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1219 06:20:24.190697 2064791 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1219 06:20:24.190745 2064791 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1219 06:20:24.190796 2064791 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1219 06:20:24.190844 2064791 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1219 06:20:24.261095 2064791 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1219 06:20:24.261198 2064791 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1219 06:20:24.261287 2064791 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1219 06:20:24.273343 2064791 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1219 06:20:24.278556 2064791 out.go:252]   - Generating certificates and keys ...
	I1219 06:20:24.278645 2064791 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1219 06:20:24.278707 2064791 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1219 06:20:24.278781 2064791 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1219 06:20:24.278840 2064791 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1219 06:20:24.278908 2064791 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1219 06:20:24.278961 2064791 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1219 06:20:24.279023 2064791 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1219 06:20:24.279082 2064791 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1219 06:20:24.279155 2064791 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1219 06:20:24.279227 2064791 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1219 06:20:24.279263 2064791 kubeadm.go:319] [certs] Using the existing "sa" key
	I1219 06:20:24.279319 2064791 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1219 06:20:24.586742 2064791 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1219 06:20:24.705000 2064791 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1219 06:20:25.117117 2064791 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1219 06:20:25.207046 2064791 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1219 06:20:25.407003 2064791 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1219 06:20:25.408181 2064791 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1219 06:20:25.412332 2064791 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1219 06:20:25.415422 2064791 out.go:252]   - Booting up control plane ...
	I1219 06:20:25.415519 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1219 06:20:25.415596 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1219 06:20:25.415664 2064791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1219 06:20:25.435196 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1219 06:20:25.435555 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1219 06:20:25.442782 2064791 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1219 06:20:25.443056 2064791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1219 06:20:25.443098 2064791 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1219 06:20:25.586740 2064791 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1219 06:20:25.586852 2064791 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1219 06:24:25.586924 2064791 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000209622s
	I1219 06:24:25.586949 2064791 kubeadm.go:319] 
	I1219 06:24:25.587005 2064791 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1219 06:24:25.587037 2064791 kubeadm.go:319] 	- The kubelet is not running
	I1219 06:24:25.587152 2064791 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1219 06:24:25.587157 2064791 kubeadm.go:319] 
	I1219 06:24:25.587305 2064791 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1219 06:24:25.587351 2064791 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1219 06:24:25.587399 2064791 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1219 06:24:25.587405 2064791 kubeadm.go:319] 
	I1219 06:24:25.592745 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 06:24:25.593206 2064791 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1219 06:24:25.593358 2064791 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 06:24:25.593654 2064791 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	I1219 06:24:25.593660 2064791 kubeadm.go:319] 
	I1219 06:24:25.593751 2064791 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1219 06:24:25.593818 2064791 kubeadm.go:403] duration metric: took 12m8.761907578s to StartCluster
	I1219 06:24:25.593849 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:24:25.593915 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:24:25.619076 2064791 cri.go:92] found id: ""
	I1219 06:24:25.619090 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.619097 2064791 logs.go:284] No container was found matching "kube-apiserver"
	I1219 06:24:25.619103 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:24:25.619166 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:24:25.645501 2064791 cri.go:92] found id: ""
	I1219 06:24:25.645515 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.645522 2064791 logs.go:284] No container was found matching "etcd"
	I1219 06:24:25.645527 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:24:25.645587 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:24:25.671211 2064791 cri.go:92] found id: ""
	I1219 06:24:25.671225 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.671232 2064791 logs.go:284] No container was found matching "coredns"
	I1219 06:24:25.671237 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:24:25.671297 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:24:25.695076 2064791 cri.go:92] found id: ""
	I1219 06:24:25.695090 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.695098 2064791 logs.go:284] No container was found matching "kube-scheduler"
	I1219 06:24:25.695104 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:24:25.695165 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:24:25.720717 2064791 cri.go:92] found id: ""
	I1219 06:24:25.720733 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.720740 2064791 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:24:25.720745 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:24:25.720832 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:24:25.746445 2064791 cri.go:92] found id: ""
	I1219 06:24:25.746460 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.746466 2064791 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 06:24:25.746478 2064791 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:24:25.746541 2064791 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:24:25.771217 2064791 cri.go:92] found id: ""
	I1219 06:24:25.771231 2064791 logs.go:282] 0 containers: []
	W1219 06:24:25.771238 2064791 logs.go:284] No container was found matching "kindnet"
	I1219 06:24:25.771249 2064791 logs.go:123] Gathering logs for kubelet ...
	I1219 06:24:25.771259 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:24:25.827848 2064791 logs.go:123] Gathering logs for dmesg ...
	I1219 06:24:25.827867 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:24:25.845454 2064791 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:24:25.845470 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:24:25.916464 2064791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:24:25.906852   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.907635   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909247   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909952   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.911845   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1219 06:24:25.906852   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.907635   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909247   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.909952   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:24:25.911845   20933 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:24:25.916485 2064791 logs.go:123] Gathering logs for containerd ...
	I1219 06:24:25.916495 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:24:25.988149 2064791 logs.go:123] Gathering logs for container status ...
	I1219 06:24:25.988168 2064791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1219 06:24:26.019538 2064791 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000209622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": dial tcp 127.0.0.1:10248: connect: connection refused
	
	To see the stack trace of this error execute with --v=5 or higher
	W1219 06:24:26.019579 2064791 out.go:285] * 
	W1219 06:24:26.019696 2064791 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout/stderr: (identical to the kubeadm init output above)
	
	W1219 06:24:26.019769 2064791 out.go:285] * 
	W1219 06:24:26.022296 2064791 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1219 06:24:26.028311 2064791 out.go:203] 
	W1219 06:24:26.031204 2064791 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout/stderr: (identical to the kubeadm init output above)
	
	W1219 06:24:26.031251 2064791 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1219 06:24:26.031270 2064791 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1219 06:24:26.034280 2064791 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483798627Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483867223Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.483965234Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484038564Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484104559Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484166960Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484226562Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484289119Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484361021Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484452469Z" level=info msg="Connect containerd service"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.484896289Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.485577404Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.498876654Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.499249089Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.498953709Z" level=info msg="Start subscribing containerd event"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.499359457Z" level=info msg="Start recovering state"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541527820Z" level=info msg="Start event monitor"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541744389Z" level=info msg="Start cni network conf syncer for default"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541814527Z" level=info msg="Start streaming server"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541876723Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541934873Z" level=info msg="runtime interface starting up..."
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.541989897Z" level=info msg="starting plugins..."
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.542066690Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 19 06:12:15 functional-006924 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 19 06:12:15 functional-006924 containerd[9691]: time="2025-12-19T06:12:15.544548112Z" level=info msg="containerd successfully booted in 0.093860s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:26:13.226625   22368 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:13.227016   22368 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:13.228562   22368 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:13.229218   22368 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:13.230812   22368 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 06:26:13 up 11:08,  0 user,  load average: 0.26, 0.19, 0.41
	Linux functional-006924 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 19 06:26:10 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:10 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 460.
	Dec 19 06:26:10 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:10 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:10 functional-006924 kubelet[22252]: E1219 06:26:10.945389   22252 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:10 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:10 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:11 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 461.
	Dec 19 06:26:11 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:11 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:11 functional-006924 kubelet[22257]: E1219 06:26:11.726350   22257 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:11 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:11 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:12 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 462.
	Dec 19 06:26:12 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:12 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:12 functional-006924 kubelet[22278]: E1219 06:26:12.458449   22278 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:12 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:12 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:13 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 463.
	Dec 19 06:26:13 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:13 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:13 functional-006924 kubelet[22361]: E1219 06:26:13.211724   22361 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:13 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:13 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
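The kubelet crash loop in the log above (restart counters 460–463) is caused by kubelet v1.35.0-rc.1 refusing to validate its configuration on a host that still uses the cgroup v1 hierarchy. A minimal diagnostic sketch (assuming a Linux host with GNU coreutils `stat`; not part of the test suite itself) to check which cgroup layout the node exposes:

```shell
# Determine whether the host mounts cgroup v1 or v2.
# On cgroup v2 (unified hierarchy), /sys/fs/cgroup is a cgroup2fs mount;
# on cgroup v1 it is a tmpfs holding per-controller subdirectories.
fstype=$(stat -fc %T /sys/fs/cgroup/ 2>/dev/null)
case "$fstype" in
  cgroup2fs) echo "cgroup v2 (unified hierarchy)" ;;
  tmpfs)     echo "cgroup v1 (legacy hierarchy)" ;;
  *)         echo "unknown cgroup layout: $fstype" ;;
esac
```

On a kernel like the `5.15.0-1084-aws` / Ubuntu 20.04 host shown in the `==> kernel <==` section, the legacy hierarchy is typically still the default unless `systemd.unified_cgroup_hierarchy=1` is set on the kernel command line, which would explain the repeated validation failure.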
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924: exit status 2 (364.970638ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-006924" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmdConnect (2.41s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim (241.68s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:50: (dbg) TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[previous WARNING line repeated 9 more times]
I1219 06:24:44.372523 2000386 retry.go:31] will retry after 4.482617915s: Temporary Error: Get "http://10.107.195.190": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[previous WARNING line repeated 14 more times]
I1219 06:24:58.855600 2000386 retry.go:31] will retry after 5.937509242s: Temporary Error: Get "http://10.107.195.190": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[previous WARNING line repeated 14 more times]
I1219 06:25:14.794712 2000386 retry.go:31] will retry after 5.61545844s: Temporary Error: Get "http://10.107.195.190": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[previous WARNING line repeated 12 more times]
E1219 06:25:27.486334 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[previous WARNING line repeated 2 more times]
I1219 06:25:30.411242 2000386 retry.go:31] will retry after 5.537730411s: Temporary Error: Get "http://10.107.195.190": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[previous WARNING line repeated 15 more times]
I1219 06:25:45.950578 2000386 retry.go:31] will retry after 15.535171817s: Temporary Error: Get "http://10.107.195.190": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
[previous WARNING line repeated 7 more times]
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:338: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
functional_test_pvc_test.go:50: ***** TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim: pod "integration-test=storage-provisioner" failed to start within 4m0s: context deadline exceeded ****
functional_test_pvc_test.go:50: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924
functional_test_pvc_test.go:50: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924: exit status 2 (339.608791ms)

-- stdout --
	Stopped

-- /stdout --
functional_test_pvc_test.go:50: status error: exit status 2 (may be ok)
functional_test_pvc_test.go:50: "functional-006924" apiserver is not running, skipping kubectl commands (state="Stopped")
functional_test_pvc_test.go:51: failed waiting for storage-provisioner: integration-test=storage-provisioner within 4m0s: context deadline exceeded
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-006924
helpers_test.go:244: (dbg) docker inspect functional-006924:

-- stdout --
	[
	    {
	        "Id": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	        "Created": "2025-12-19T05:57:32.987616309Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2053574,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T05:57:33.050252475Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hostname",
	        "HostsPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hosts",
	        "LogPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6-json.log",
	        "Name": "/functional-006924",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-006924:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-006924",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	                "LowerDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/merged",
	                "UpperDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/diff",
	                "WorkDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-006924",
	                "Source": "/var/lib/docker/volumes/functional-006924/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-006924",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-006924",
	                "name.minikube.sigs.k8s.io": "functional-006924",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "c06ab2bd44169716d410789ed39ed6e7c04e20cbf7fddb96691439282b9c97ca",
	            "SandboxKey": "/var/run/docker/netns/c06ab2bd4416",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34704"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34705"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34708"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34706"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34707"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-006924": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "42:2f:87:6a:a8:7b",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "f63e8dc2cff83663f8a4d14108f192e61e457410fa4fc720cd9630dbf354815d",
	                    "EndpointID": "aa2b1cbd90d5c1f6130481423d97f82d974d4197e41ad0dbe3b7e51b22c8b4cc",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-006924",
	                        "651d0d6ef1db"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924: exit status 2 (302.64516ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-006924 image load --daemon kicbase/echo-server:functional-006924 --alsologtostderr                                                                   │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image          │ functional-006924 image ls                                                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image          │ functional-006924 image save kicbase/echo-server:functional-006924 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image          │ functional-006924 image rm kicbase/echo-server:functional-006924 --alsologtostderr                                                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image          │ functional-006924 image ls                                                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image          │ functional-006924 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image          │ functional-006924 image ls                                                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image          │ functional-006924 image save --daemon kicbase/echo-server:functional-006924 --alsologtostderr                                                                   │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh            │ functional-006924 ssh sudo cat /etc/ssl/certs/2000386.pem                                                                                                       │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh            │ functional-006924 ssh sudo cat /usr/share/ca-certificates/2000386.pem                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh            │ functional-006924 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                                        │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh            │ functional-006924 ssh sudo cat /etc/ssl/certs/20003862.pem                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh            │ functional-006924 ssh sudo cat /usr/share/ca-certificates/20003862.pem                                                                                          │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh            │ functional-006924 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh            │ functional-006924 ssh sudo cat /etc/test/nested/copy/2000386/hosts                                                                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image          │ functional-006924 image ls --format short --alsologtostderr                                                                                                     │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image          │ functional-006924 image ls --format yaml --alsologtostderr                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh            │ functional-006924 ssh pgrep buildkitd                                                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ image          │ functional-006924 image build -t localhost/my-image:functional-006924 testdata/build --alsologtostderr                                                          │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image          │ functional-006924 image ls                                                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image          │ functional-006924 image ls --format json --alsologtostderr                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image          │ functional-006924 image ls --format table --alsologtostderr                                                                                                     │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ update-context │ functional-006924 update-context --alsologtostderr -v=2                                                                                                         │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ update-context │ functional-006924 update-context --alsologtostderr -v=2                                                                                                         │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ update-context │ functional-006924 update-context --alsologtostderr -v=2                                                                                                         │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 06:26:29
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 06:26:29.646762 2082108 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:26:29.646954 2082108 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:26:29.646981 2082108 out.go:374] Setting ErrFile to fd 2...
	I1219 06:26:29.647003 2082108 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:26:29.647305 2082108 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:26:29.652707 2082108 out.go:368] Setting JSON to false
	I1219 06:26:29.653574 2082108 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":40136,"bootTime":1766085454,"procs":158,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:26:29.653730 2082108 start.go:143] virtualization:  
	I1219 06:26:29.656825 2082108 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:26:29.660666 2082108 notify.go:221] Checking for updates...
	I1219 06:26:29.663753 2082108 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:26:29.666653 2082108 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:26:29.669563 2082108 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:26:29.672409 2082108 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:26:29.675294 2082108 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:26:29.680090 2082108 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:26:29.683366 2082108 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:26:29.683923 2082108 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:26:29.729856 2082108 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:26:29.729981 2082108 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:26:29.790364 2082108 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:26:29.781319903 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:26:29.790469 2082108 docker.go:319] overlay module found
	I1219 06:26:29.793548 2082108 out.go:179] * Using the docker driver based on existing profile
	I1219 06:26:29.796474 2082108 start.go:309] selected driver: docker
	I1219 06:26:29.796494 2082108 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:26:29.796598 2082108 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:26:29.796707 2082108 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:26:29.856113 2082108 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:26:29.847123734 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:26:29.856550 2082108 cni.go:84] Creating CNI manager for ""
	I1219 06:26:29.856616 2082108 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:26:29.856657 2082108 start.go:353] cluster config:
	{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:26:29.859809 2082108 out.go:179] * dry-run validation complete!
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 19 06:26:34 functional-006924 containerd[9691]: time="2025-12-19T06:26:34.695886291Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:26:34 functional-006924 containerd[9691]: time="2025-12-19T06:26:34.696492517Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-006924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:26:35 functional-006924 containerd[9691]: time="2025-12-19T06:26:35.775016443Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-006924\""
	Dec 19 06:26:35 functional-006924 containerd[9691]: time="2025-12-19T06:26:35.777843698Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-006924\""
	Dec 19 06:26:35 functional-006924 containerd[9691]: time="2025-12-19T06:26:35.780155771Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 19 06:26:35 functional-006924 containerd[9691]: time="2025-12-19T06:26:35.789258916Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-006924\" returns successfully"
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.036820570Z" level=info msg="No images store for sha256:769e32ae5c7cfec09b4844dd5e67873bc3415c1e5826b930385086ee13345efc"
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.038979477Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-006924\""
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.048443815Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.049194388Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-006924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.849024005Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-006924\""
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.851539355Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-006924\""
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.853854538Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.862435486Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-006924\" returns successfully"
	Dec 19 06:26:37 functional-006924 containerd[9691]: time="2025-12-19T06:26:37.520420426Z" level=info msg="No images store for sha256:203ff4e0ac9a4ee4e13df6655d4456a33bf5c3c3239baf23b1bed83c5252f7cb"
	Dec 19 06:26:37 functional-006924 containerd[9691]: time="2025-12-19T06:26:37.522696421Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-006924\""
	Dec 19 06:26:37 functional-006924 containerd[9691]: time="2025-12-19T06:26:37.530262565Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:26:37 functional-006924 containerd[9691]: time="2025-12-19T06:26:37.531052112Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-006924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:26:45 functional-006924 containerd[9691]: time="2025-12-19T06:26:45.376448975Z" level=info msg="connecting to shim w2yjzeggvnhphvtprbjle76ly" address="unix:///run/containerd/s/893cc859981aa6c0d6ed6e815351337b83d6cc694b88d6ac0a1b9c6b1aea5e14" namespace=k8s.io protocol=ttrpc version=3
	Dec 19 06:26:45 functional-006924 containerd[9691]: time="2025-12-19T06:26:45.454334599Z" level=info msg="shim disconnected" id=w2yjzeggvnhphvtprbjle76ly namespace=k8s.io
	Dec 19 06:26:45 functional-006924 containerd[9691]: time="2025-12-19T06:26:45.454375232Z" level=info msg="cleaning up after shim disconnected" id=w2yjzeggvnhphvtprbjle76ly namespace=k8s.io
	Dec 19 06:26:45 functional-006924 containerd[9691]: time="2025-12-19T06:26:45.454385833Z" level=info msg="cleaning up dead shim" id=w2yjzeggvnhphvtprbjle76ly namespace=k8s.io
	Dec 19 06:26:45 functional-006924 containerd[9691]: time="2025-12-19T06:26:45.723907686Z" level=info msg="ImageCreate event name:\"localhost/my-image:functional-006924\""
	Dec 19 06:26:45 functional-006924 containerd[9691]: time="2025-12-19T06:26:45.730133623Z" level=info msg="ImageCreate event name:\"sha256:6505c5ca2c8ea31d3c7a79e95c6328d648b53a66c622dff2fd9f8335dbe3084d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:26:45 functional-006924 containerd[9691]: time="2025-12-19T06:26:45.730681493Z" level=info msg="ImageUpdate event name:\"localhost/my-image:functional-006924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:28:36.069815   25140 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:28:36.070610   25140 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:28:36.072434   25140 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:28:36.073121   25140 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:28:36.074862   25140 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 06:28:36 up 11:11,  0 user,  load average: 0.37, 0.33, 0.44
	Linux functional-006924 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 19 06:28:32 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:28:33 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 650.
	Dec 19 06:28:33 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:28:33 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:28:33 functional-006924 kubelet[25009]: E1219 06:28:33.443362   25009 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:28:33 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:28:33 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:28:34 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 651.
	Dec 19 06:28:34 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:28:34 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:28:34 functional-006924 kubelet[25015]: E1219 06:28:34.191599   25015 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:28:34 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:28:34 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:28:34 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 652.
	Dec 19 06:28:34 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:28:34 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:28:34 functional-006924 kubelet[25020]: E1219 06:28:34.969733   25020 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:28:34 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:28:34 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:28:35 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 653.
	Dec 19 06:28:35 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:28:35 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:28:35 functional-006924 kubelet[25056]: E1219 06:28:35.705060   25056 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:28:35 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:28:35 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	
-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924: exit status 2 (337.401563ms)
-- stdout --
	Stopped
-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-006924" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PersistentVolumeClaim (241.68s)
TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels (1.41s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels
functional_test.go:234: (dbg) Run:  kubectl --context functional-006924 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
functional_test.go:234: (dbg) Non-zero exit: kubectl --context functional-006924 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": exit status 1 (60.552748ms)
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	
-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range
** /stderr **
functional_test.go:236: failed to 'kubectl get nodes' with args "kubectl --context functional-006924 get nodes --output=go-template \"--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'\"": exit status 1
functional_test.go:242: expected to have label "minikube.k8s.io/commit" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	
-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range
** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/version" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	
-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range
** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/updated_at" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	
-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range
** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/name" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	
-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range
** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/primary" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	
-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range
** /stderr **
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect functional-006924
helpers_test.go:244: (dbg) docker inspect functional-006924:
-- stdout --
	[
	    {
	        "Id": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	        "Created": "2025-12-19T05:57:32.987616309Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2053574,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T05:57:33.050252475Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hostname",
	        "HostsPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/hosts",
	        "LogPath": "/var/lib/docker/containers/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6/651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6-json.log",
	        "Name": "/functional-006924",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-006924:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-006924",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "651d0d6ef1db4974dc33ac895a5216846a242813d963b10058150c93b1e4cae6",
	                "LowerDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/merged",
	                "UpperDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/diff",
	                "WorkDir": "/var/lib/docker/overlay2/37e64f46ab514c62069e4338bcac4816059c7746e8935882d7526ec5f3a01f73/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-006924",
	                "Source": "/var/lib/docker/volumes/functional-006924/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-006924",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-006924",
	                "name.minikube.sigs.k8s.io": "functional-006924",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "c06ab2bd44169716d410789ed39ed6e7c04e20cbf7fddb96691439282b9c97ca",
	            "SandboxKey": "/var/run/docker/netns/c06ab2bd4416",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34704"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34705"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34708"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34706"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34707"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-006924": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "42:2f:87:6a:a8:7b",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "f63e8dc2cff83663f8a4d14108f192e61e457410fa4fc720cd9630dbf354815d",
	                    "EndpointID": "aa2b1cbd90d5c1f6130481423d97f82d974d4197e41ad0dbe3b7e51b22c8b4cc",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-006924",
	                        "651d0d6ef1db"
	                    ]
	                }
	            }
	        }
	    }
	]
-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-006924 -n functional-006924: exit status 2 (302.751974ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs -n 25
helpers_test.go:261: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels logs: 
-- stdout --
	
	==> Audit <==
	┌───────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│  COMMAND  │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├───────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh       │ functional-006924 ssh findmnt -T /mount1                                                                                                                        │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ mount     │ -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2343003325/001:/mount3 --alsologtostderr -v=1                            │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh       │ functional-006924 ssh findmnt -T /mount1                                                                                                                        │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh       │ functional-006924 ssh findmnt -T /mount2                                                                                                                        │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh       │ functional-006924 ssh findmnt -T /mount3                                                                                                                        │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ mount     │ -p functional-006924 --kill=true                                                                                                                                │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ start     │ -p functional-006924 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ start     │ -p functional-006924 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1               │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ start     │ -p functional-006924 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1                         │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ dashboard │ --url --port 36195 -p functional-006924 --alsologtostderr -v=1                                                                                                  │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ license   │                                                                                                                                                                 │ minikube          │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ ssh       │ functional-006924 ssh sudo systemctl is-active docker                                                                                                           │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ ssh       │ functional-006924 ssh sudo systemctl is-active crio                                                                                                             │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │                     │
	│ image     │ functional-006924 image load --daemon kicbase/echo-server:functional-006924 --alsologtostderr                                                                   │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image     │ functional-006924 image ls                                                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image     │ functional-006924 image load --daemon kicbase/echo-server:functional-006924 --alsologtostderr                                                                   │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image     │ functional-006924 image ls                                                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image     │ functional-006924 image load --daemon kicbase/echo-server:functional-006924 --alsologtostderr                                                                   │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image     │ functional-006924 image ls                                                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image     │ functional-006924 image save kicbase/echo-server:functional-006924 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image     │ functional-006924 image rm kicbase/echo-server:functional-006924 --alsologtostderr                                                                              │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image     │ functional-006924 image ls                                                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image     │ functional-006924 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image     │ functional-006924 image ls                                                                                                                                      │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	│ image     │ functional-006924 image save --daemon kicbase/echo-server:functional-006924 --alsologtostderr                                                                   │ functional-006924 │ jenkins │ v1.37.0 │ 19 Dec 25 06:26 UTC │ 19 Dec 25 06:26 UTC │
	└───────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 06:26:29
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 06:26:29.646762 2082108 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:26:29.646954 2082108 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:26:29.646981 2082108 out.go:374] Setting ErrFile to fd 2...
	I1219 06:26:29.647003 2082108 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:26:29.647305 2082108 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:26:29.652707 2082108 out.go:368] Setting JSON to false
	I1219 06:26:29.653574 2082108 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":40136,"bootTime":1766085454,"procs":158,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:26:29.653730 2082108 start.go:143] virtualization:  
	I1219 06:26:29.656825 2082108 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:26:29.660666 2082108 notify.go:221] Checking for updates...
	I1219 06:26:29.663753 2082108 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:26:29.666653 2082108 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:26:29.669563 2082108 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:26:29.672409 2082108 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:26:29.675294 2082108 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:26:29.680090 2082108 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:26:29.683366 2082108 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:26:29.683923 2082108 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:26:29.729856 2082108 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:26:29.729981 2082108 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:26:29.790364 2082108 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:26:29.781319903 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:26:29.790469 2082108 docker.go:319] overlay module found
	I1219 06:26:29.793548 2082108 out.go:179] * Using the docker driver based on existing profile
	I1219 06:26:29.796474 2082108 start.go:309] selected driver: docker
	I1219 06:26:29.796494 2082108 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:26:29.796598 2082108 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:26:29.796707 2082108 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:26:29.856113 2082108 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:26:29.847123734 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:26:29.856550 2082108 cni.go:84] Creating CNI manager for ""
	I1219 06:26:29.856616 2082108 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:26:29.856657 2082108 start.go:353] cluster config:
	{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:26:29.859809 2082108 out.go:179] * dry-run validation complete!
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 19 06:26:33 functional-006924 containerd[9691]: time="2025-12-19T06:26:33.605966380Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-006924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:26:34 functional-006924 containerd[9691]: time="2025-12-19T06:26:34.438713286Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-006924\""
	Dec 19 06:26:34 functional-006924 containerd[9691]: time="2025-12-19T06:26:34.441670486Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-006924\""
	Dec 19 06:26:34 functional-006924 containerd[9691]: time="2025-12-19T06:26:34.444457806Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 19 06:26:34 functional-006924 containerd[9691]: time="2025-12-19T06:26:34.456652036Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-006924\" returns successfully"
	Dec 19 06:26:34 functional-006924 containerd[9691]: time="2025-12-19T06:26:34.685982638Z" level=info msg="No images store for sha256:769e32ae5c7cfec09b4844dd5e67873bc3415c1e5826b930385086ee13345efc"
	Dec 19 06:26:34 functional-006924 containerd[9691]: time="2025-12-19T06:26:34.688290412Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-006924\""
	Dec 19 06:26:34 functional-006924 containerd[9691]: time="2025-12-19T06:26:34.695886291Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:26:34 functional-006924 containerd[9691]: time="2025-12-19T06:26:34.696492517Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-006924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:26:35 functional-006924 containerd[9691]: time="2025-12-19T06:26:35.775016443Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-006924\""
	Dec 19 06:26:35 functional-006924 containerd[9691]: time="2025-12-19T06:26:35.777843698Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-006924\""
	Dec 19 06:26:35 functional-006924 containerd[9691]: time="2025-12-19T06:26:35.780155771Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 19 06:26:35 functional-006924 containerd[9691]: time="2025-12-19T06:26:35.789258916Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-006924\" returns successfully"
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.036820570Z" level=info msg="No images store for sha256:769e32ae5c7cfec09b4844dd5e67873bc3415c1e5826b930385086ee13345efc"
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.038979477Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-006924\""
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.048443815Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.049194388Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-006924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.849024005Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-006924\""
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.851539355Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-006924\""
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.853854538Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 19 06:26:36 functional-006924 containerd[9691]: time="2025-12-19T06:26:36.862435486Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-006924\" returns successfully"
	Dec 19 06:26:37 functional-006924 containerd[9691]: time="2025-12-19T06:26:37.520420426Z" level=info msg="No images store for sha256:203ff4e0ac9a4ee4e13df6655d4456a33bf5c3c3239baf23b1bed83c5252f7cb"
	Dec 19 06:26:37 functional-006924 containerd[9691]: time="2025-12-19T06:26:37.522696421Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-006924\""
	Dec 19 06:26:37 functional-006924 containerd[9691]: time="2025-12-19T06:26:37.530262565Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:26:37 functional-006924 containerd[9691]: time="2025-12-19T06:26:37.531052112Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-006924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1219 06:26:39.176249   23767 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:39.176664   23767 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:39.178339   23767 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:39.178692   23767 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1219 06:26:39.180148   23767 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 06:26:39 up 11:09,  0 user,  load average: 0.91, 0.35, 0.46
	Linux functional-006924 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 19 06:26:35 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:36 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 494.
	Dec 19 06:26:36 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:36 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:36 functional-006924 kubelet[23514]: E1219 06:26:36.464802   23514 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:36 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:36 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:37 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 495.
	Dec 19 06:26:37 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:37 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:37 functional-006924 kubelet[23582]: E1219 06:26:37.219621   23582 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:37 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:37 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:37 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 496.
	Dec 19 06:26:37 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:37 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:37 functional-006924 kubelet[23632]: E1219 06:26:37.955622   23632 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:37 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:37 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 06:26:38 functional-006924 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 497.
	Dec 19 06:26:38 functional-006924 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:38 functional-006924 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 06:26:38 functional-006924 kubelet[23683]: E1219 06:26:38.710018   23683 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 06:26:38 functional-006924 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 06:26:38 functional-006924 systemd[1]: kubelet.service: Failed with result 'exit-code'.
-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-006924 -n functional-006924: exit status 2 (313.471439ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "functional-006924" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NodeLabels (1.41s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/RunSecondTunnel (0.46s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-006924 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-006924 tunnel --alsologtostderr]
functional_test_tunnel_test.go:190: tunnel command failed with unexpected error: exit code 103. stderr: I1219 06:24:33.890803 2077844 out.go:360] Setting OutFile to fd 1 ...
I1219 06:24:33.890898 2077844 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:24:33.890937 2077844 out.go:374] Setting ErrFile to fd 2...
I1219 06:24:33.890950 2077844 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:24:33.891208 2077844 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 06:24:33.891464 2077844 mustload.go:66] Loading cluster: functional-006924
I1219 06:24:33.891873 2077844 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1219 06:24:33.892326 2077844 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
I1219 06:24:33.929604 2077844 host.go:66] Checking if "functional-006924" exists ...
I1219 06:24:33.929945 2077844 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1219 06:24:34.059979 2077844 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:24:34.042202032 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aa
rch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pa
th:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1219 06:24:34.060114 2077844 api_server.go:166] Checking apiserver status ...
I1219 06:24:34.060170 2077844 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1219 06:24:34.060296 2077844 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
I1219 06:24:34.082743 2077844 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
W1219 06:24:34.214559 2077844 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
stdout:

                                                
                                                
stderr:
I1219 06:24:34.218013 2077844 out.go:179] * The control-plane node functional-006924 apiserver is not running: (state=Stopped)
I1219 06:24:34.221275 2077844 out.go:179]   To start a cluster, run: "minikube start -p functional-006924"

                                                
                                                
stdout: * The control-plane node functional-006924 apiserver is not running: (state=Stopped)
To start a cluster, run: "minikube start -p functional-006924"
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-006924 tunnel --alsologtostderr] ...
helpers_test.go:520: unable to terminate pid 2077845: os: process already finished
functional_test_tunnel_test.go:194: read stdout failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-006924 tunnel --alsologtostderr] stdout:
functional_test_tunnel_test.go:194: read stderr failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-006924 tunnel --alsologtostderr] stderr:
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-006924 tunnel --alsologtostderr] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_tunnel_test.go:194: read stdout failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-006924 tunnel --alsologtostderr] stdout:
functional_test_tunnel_test.go:194: read stderr failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-006924 tunnel --alsologtostderr] stderr:
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/RunSecondTunnel (0.46s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/WaitService/Setup (0.13s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-006924 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:212: (dbg) Non-zero exit: kubectl --context functional-006924 apply -f testdata/testsvc.yaml: exit status 1 (125.726145ms)

                                                
                                                
** stderr ** 
	error: error validating "testdata/testsvc.yaml": error validating data: failed to download openapi: Get "https://192.168.49.2:8441/openapi/v2?timeout=32s": dial tcp 192.168.49.2:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false

                                                
                                                
** /stderr **
functional_test_tunnel_test.go:214: kubectl --context functional-006924 apply -f testdata/testsvc.yaml failed: exit status 1
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/WaitService/Setup (0.13s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessDirect (97.18s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:288: failed to hit nginx at "http://10.107.195.190": Temporary Error: Get "http://10.107.195.190": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
functional_test_tunnel_test.go:290: (dbg) Run:  kubectl --context functional-006924 get svc nginx-svc
functional_test_tunnel_test.go:290: (dbg) Non-zero exit: kubectl --context functional-006924 get svc nginx-svc: exit status 1 (66.563446ms)

                                                
                                                
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

                                                
                                                
** /stderr **
functional_test_tunnel_test.go:292: kubectl --context functional-006924 get svc nginx-svc failed: exit status 1
functional_test_tunnel_test.go:294: failed to kubectl get svc nginx-svc:
functional_test_tunnel_test.go:301: expected body to contain "Welcome to nginx!", but got *""*
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessDirect (97.18s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/DeployApp (0.06s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/DeployApp
functional_test.go:1451: (dbg) Run:  kubectl --context functional-006924 create deployment hello-node --image kicbase/echo-server
functional_test.go:1451: (dbg) Non-zero exit: kubectl --context functional-006924 create deployment hello-node --image kicbase/echo-server: exit status 1 (57.13078ms)

                                                
                                                
** stderr ** 
	error: failed to create deployment: Post "https://192.168.49.2:8441/apis/apps/v1/namespaces/default/deployments?fieldManager=kubectl-create&fieldValidation=Strict": dial tcp 192.168.49.2:8441: connect: connection refused

                                                
                                                
** /stderr **
functional_test.go:1453: failed to create hello-node deployment with this command "kubectl --context functional-006924 create deployment hello-node --image kicbase/echo-server": exit status 1.
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/DeployApp (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/List (0.26s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/List
functional_test.go:1469: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 service list
functional_test.go:1469: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 service list: exit status 103 (255.65336ms)

                                                
                                                
-- stdout --
	* The control-plane node functional-006924 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-006924"

                                                
                                                
-- /stdout --
functional_test.go:1471: failed to do service list. args "out/minikube-linux-arm64 -p functional-006924 service list" : exit status 103
functional_test.go:1474: expected 'service list' to contain *hello-node* but got -"* The control-plane node functional-006924 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-006924\"\n"-
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/List (0.26s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/JSONOutput (0.28s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/JSONOutput
functional_test.go:1499: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 service list -o json
functional_test.go:1499: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 service list -o json: exit status 103 (280.599101ms)

                                                
                                                
-- stdout --
	* The control-plane node functional-006924 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-006924"

                                                
                                                
-- /stdout --
functional_test.go:1501: failed to list services with json format. args "out/minikube-linux-arm64 -p functional-006924 service list -o json": exit status 103
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/JSONOutput (0.28s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/HTTPS (0.28s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/HTTPS
functional_test.go:1519: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 service --namespace=default --https --url hello-node
functional_test.go:1519: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 service --namespace=default --https --url hello-node: exit status 103 (280.066115ms)

                                                
                                                
-- stdout --
	* The control-plane node functional-006924 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-006924"

                                                
                                                
-- /stdout --
functional_test.go:1521: failed to get service url. args "out/minikube-linux-arm64 -p functional-006924 service --namespace=default --https --url hello-node" : exit status 103
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/HTTPS (0.28s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/Format (0.26s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/Format
functional_test.go:1550: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 service hello-node --url --format={{.IP}}
functional_test.go:1550: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 service hello-node --url --format={{.IP}}: exit status 103 (262.489568ms)

                                                
                                                
-- stdout --
	* The control-plane node functional-006924 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-006924"

                                                
                                                
-- /stdout --
functional_test.go:1552: failed to get service url with custom format. args "out/minikube-linux-arm64 -p functional-006924 service hello-node --url --format={{.IP}}": exit status 103
functional_test.go:1558: "* The control-plane node functional-006924 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-006924\"" is not a valid IP
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/Format (0.26s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/URL (0.3s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/URL
functional_test.go:1569: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 service hello-node --url
functional_test.go:1569: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 service hello-node --url: exit status 103 (296.493973ms)

                                                
                                                
-- stdout --
	* The control-plane node functional-006924 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-006924"

                                                
                                                
-- /stdout --
functional_test.go:1571: failed to get service url. args: "out/minikube-linux-arm64 -p functional-006924 service hello-node --url": exit status 103
functional_test.go:1575: found endpoint for hello-node: * The control-plane node functional-006924 apiserver is not running: (state=Stopped)
To start a cluster, run: "minikube start -p functional-006924"
functional_test.go:1579: failed to parse "* The control-plane node functional-006924 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-006924\"": parse "* The control-plane node functional-006924 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-006924\"": net/url: invalid control character in URL
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ServiceCmd/URL (0.30s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port (2.65s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port
functional_test_mount_test.go:74: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3154741832/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:108: wrote "test-1766125579269030938" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3154741832/001/created-by-test
functional_test_mount_test.go:108: wrote "test-1766125579269030938" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3154741832/001/created-by-test-removed-by-pod
functional_test_mount_test.go:108: wrote "test-1766125579269030938" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3154741832/001/test-1766125579269030938
functional_test_mount_test.go:116: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:116: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (363.656966ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 1

                                                
                                                
** /stderr **
I1219 06:26:19.632980 2000386 retry.go:31] will retry after 721.841156ms: exit status 1
functional_test_mount_test.go:116: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:130: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh -- ls -la /mount-9p
functional_test_mount_test.go:134: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Dec 19 06:26 created-by-test
-rw-r--r-- 1 docker docker 24 Dec 19 06:26 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Dec 19 06:26 test-1766125579269030938
functional_test_mount_test.go:138: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh cat /mount-9p/test-1766125579269030938
functional_test_mount_test.go:149: (dbg) Run:  kubectl --context functional-006924 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:149: (dbg) Non-zero exit: kubectl --context functional-006924 replace --force -f testdata/busybox-mount-test.yaml: exit status 1 (59.72938ms)

                                                
                                                
** stderr ** 
	error: error when deleting "testdata/busybox-mount-test.yaml": Delete "https://192.168.49.2:8441/api/v1/namespaces/default/pods/busybox-mount": dial tcp 192.168.49.2:8441: connect: connection refused

                                                
                                                
** /stderr **
functional_test_mount_test.go:151: failed to 'kubectl replace' for busybox-mount-test. args "kubectl --context functional-006924 replace --force -f testdata/busybox-mount-test.yaml" : exit status 1
functional_test_mount_test.go:81: "TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port" failed, getting debug info...
functional_test_mount_test.go:82: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates"
functional_test_mount_test.go:82: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates": exit status 1 (289.234909ms)

                                                
                                                
-- stdout --
	192.168.49.1 on /mount-9p type 9p (rw,relatime,sync,dirsync,dfltuid=1000,dfltgid=997,access=any,msize=262144,trans=tcp,noextend,port=37193)
	total 2
	-rw-r--r-- 1 docker docker 24 Dec 19 06:26 created-by-test
	-rw-r--r-- 1 docker docker 24 Dec 19 06:26 created-by-test-removed-by-pod
	-rw-r--r-- 1 docker docker 24 Dec 19 06:26 test-1766125579269030938
	cat: /mount-9p/pod-dates: No such file or directory

                                                
                                                
-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

                                                
                                                
** /stderr **
functional_test_mount_test.go:84: debugging command "out/minikube-linux-arm64 -p functional-006924 ssh \"mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates\"" failed : exit status 1
functional_test_mount_test.go:91: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:95: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3154741832/001:/mount-9p --alsologtostderr -v=1] ...
functional_test_mount_test.go:95: (dbg) [out/minikube-linux-arm64 mount -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3154741832/001:/mount-9p --alsologtostderr -v=1] stdout:
* Mounting host path /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3154741832/001 into VM as /mount-9p ...
- Mount type:   9p
- User ID:      docker
- Group ID:     docker
- Version:      9p2000.L
- Message Size: 262144
- Options:      map[]
- Bind Address: 192.168.49.1:37193
* Userspace file server: 
ufs starting
* Successfully mounted /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3154741832/001 to /mount-9p

                                                
                                                
* NOTE: This process must stay alive for the mount to be accessible ...
* Unmounting /mount-9p ...

                                                
                                                

                                                
                                                
functional_test_mount_test.go:95: (dbg) [out/minikube-linux-arm64 mount -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3154741832/001:/mount-9p --alsologtostderr -v=1] stderr:
I1219 06:26:19.336375 2080169 out.go:360] Setting OutFile to fd 1 ...
I1219 06:26:19.336620 2080169 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:19.336641 2080169 out.go:374] Setting ErrFile to fd 2...
I1219 06:26:19.336656 2080169 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:19.336959 2080169 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 06:26:19.337290 2080169 mustload.go:66] Loading cluster: functional-006924
I1219 06:26:19.337667 2080169 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1219 06:26:19.338219 2080169 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
I1219 06:26:19.357172 2080169 host.go:66] Checking if "functional-006924" exists ...
I1219 06:26:19.357503 2080169 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1219 06:26:19.472696 2080169 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:26:19.462612345 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aa
rch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pa
th:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1219 06:26:19.472882 2080169 cli_runner.go:164] Run: docker network inspect functional-006924 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
I1219 06:26:19.494147 2080169 out.go:179] * Mounting host path /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3154741832/001 into VM as /mount-9p ...
I1219 06:26:19.497298 2080169 out.go:179]   - Mount type:   9p
I1219 06:26:19.500076 2080169 out.go:179]   - User ID:      docker
I1219 06:26:19.502992 2080169 out.go:179]   - Group ID:     docker
I1219 06:26:19.506017 2080169 out.go:179]   - Version:      9p2000.L
I1219 06:26:19.508935 2080169 out.go:179]   - Message Size: 262144
I1219 06:26:19.511874 2080169 out.go:179]   - Options:      map[]
I1219 06:26:19.514730 2080169 out.go:179]   - Bind Address: 192.168.49.1:37193
I1219 06:26:19.517688 2080169 out.go:179] * Userspace file server: 
I1219 06:26:19.518048 2080169 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f -l /mount-9p || echo "
I1219 06:26:19.518166 2080169 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
I1219 06:26:19.550404 2080169 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
I1219 06:26:19.660444 2080169 mount.go:180] unmount for /mount-9p ran successfully
I1219 06:26:19.660470 2080169 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /mount-9p"
I1219 06:26:19.669332 2080169 ssh_runner.go:195] Run: /bin/bash -c "sudo mount -t 9p -o dfltgid=$(grep ^docker: /etc/group | cut -d: -f3),dfltuid=$(id -u docker),msize=262144,port=37193,trans=tcp,version=9p2000.L 192.168.49.1 /mount-9p"
I1219 06:26:19.679788 2080169 main.go:127] stdlog: ufs.go:141 connected
I1219 06:26:19.679955 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tversion tag 65535 msize 262144 version '9P2000.L'
I1219 06:26:19.679994 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rversion tag 65535 msize 262144 version '9P2000'
I1219 06:26:19.680239 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tattach tag 0 fid 0 afid 4294967295 uname 'nobody' nuname 0 aname ''
I1219 06:26:19.680301 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rattach tag 0 aqid (ed779a 3549ac01 'd')
I1219 06:26:19.681113 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 0
I1219 06:26:19.681184 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (ed779a 3549ac01 'd') m d775 at 0 mt 1766125579 l 4096 t 0 d 0 ext )
I1219 06:26:19.684695 2080169 lock.go:50] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/.mount-process: {Name:mkd9e5a66e860aed8a28479b1363dc525a7823fc Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I1219 06:26:19.684907 2080169 mount.go:105] mount successful: ""
I1219 06:26:19.688202 2080169 out.go:179] * Successfully mounted /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun3154741832/001 to /mount-9p
I1219 06:26:19.691085 2080169 out.go:203] 
I1219 06:26:19.693905 2080169 out.go:179] * NOTE: This process must stay alive for the mount to be accessible ...
I1219 06:26:20.893330 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 0
I1219 06:26:20.893411 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (ed779a 3549ac01 'd') m d775 at 0 mt 1766125579 l 4096 t 0 d 0 ext )
I1219 06:26:20.893761 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Twalk tag 0 fid 0 newfid 1 
I1219 06:26:20.893799 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rwalk tag 0 
I1219 06:26:20.893945 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Topen tag 0 fid 1 mode 0
I1219 06:26:20.894001 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Ropen tag 0 qid (ed779a 3549ac01 'd') iounit 0
I1219 06:26:20.894132 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 0
I1219 06:26:20.894170 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (ed779a 3549ac01 'd') m d775 at 0 mt 1766125579 l 4096 t 0 d 0 ext )
I1219 06:26:20.894335 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tread tag 0 fid 1 offset 0 count 262120
I1219 06:26:20.894457 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rread tag 0 count 258
I1219 06:26:20.894599 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tread tag 0 fid 1 offset 258 count 261862
I1219 06:26:20.894630 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rread tag 0 count 0
I1219 06:26:20.894764 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tread tag 0 fid 1 offset 258 count 262120
I1219 06:26:20.894788 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rread tag 0 count 0
I1219 06:26:20.894913 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I1219 06:26:20.894948 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rwalk tag 0 (ed779b 3549ac01 '') 
I1219 06:26:20.895084 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 2
I1219 06:26:20.895118 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (ed779b 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:20.895247 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 2
I1219 06:26:20.895279 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (ed779b 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:20.895410 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tclunk tag 0 fid 2
I1219 06:26:20.895444 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rclunk tag 0
I1219 06:26:20.895574 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Twalk tag 0 fid 0 newfid 2 0:'test-1766125579269030938' 
I1219 06:26:20.895608 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rwalk tag 0 (ed779d 3549ac01 '') 
I1219 06:26:20.895744 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 2
I1219 06:26:20.895779 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('test-1766125579269030938' 'jenkins' 'jenkins' '' q (ed779d 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:20.895893 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 2
I1219 06:26:20.895927 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('test-1766125579269030938' 'jenkins' 'jenkins' '' q (ed779d 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:20.896061 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tclunk tag 0 fid 2
I1219 06:26:20.896082 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rclunk tag 0
I1219 06:26:20.896218 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Twalk tag 0 fid 0 newfid 2 0:'created-by-test-removed-by-pod' 
I1219 06:26:20.896259 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rwalk tag 0 (ed779c 3549ac01 '') 
I1219 06:26:20.896370 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 2
I1219 06:26:20.896401 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (ed779c 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:20.896514 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 2
I1219 06:26:20.896544 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (ed779c 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:20.896678 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tclunk tag 0 fid 2
I1219 06:26:20.896700 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rclunk tag 0
I1219 06:26:20.896824 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tread tag 0 fid 1 offset 258 count 262120
I1219 06:26:20.896852 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rread tag 0 count 0
I1219 06:26:20.896978 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tclunk tag 0 fid 1
I1219 06:26:20.897006 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rclunk tag 0
I1219 06:26:21.165190 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Twalk tag 0 fid 0 newfid 1 0:'test-1766125579269030938' 
I1219 06:26:21.165268 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rwalk tag 0 (ed779d 3549ac01 '') 
I1219 06:26:21.165438 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 1
I1219 06:26:21.165484 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('test-1766125579269030938' 'jenkins' 'jenkins' '' q (ed779d 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:21.165637 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Twalk tag 0 fid 1 newfid 2 
I1219 06:26:21.165668 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rwalk tag 0 
I1219 06:26:21.165782 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Topen tag 0 fid 2 mode 0
I1219 06:26:21.165834 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Ropen tag 0 qid (ed779d 3549ac01 '') iounit 0
I1219 06:26:21.165989 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 1
I1219 06:26:21.166063 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('test-1766125579269030938' 'jenkins' 'jenkins' '' q (ed779d 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:21.166218 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tread tag 0 fid 2 offset 0 count 262120
I1219 06:26:21.166273 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rread tag 0 count 24
I1219 06:26:21.166405 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tread tag 0 fid 2 offset 24 count 262120
I1219 06:26:21.166445 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rread tag 0 count 0
I1219 06:26:21.166597 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tread tag 0 fid 2 offset 24 count 262120
I1219 06:26:21.166633 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rread tag 0 count 0
I1219 06:26:21.166802 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tclunk tag 0 fid 2
I1219 06:26:21.166834 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rclunk tag 0
I1219 06:26:21.167099 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tclunk tag 0 fid 1
I1219 06:26:21.167133 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rclunk tag 0
I1219 06:26:21.519832 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 0
I1219 06:26:21.519909 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (ed779a 3549ac01 'd') m d775 at 0 mt 1766125579 l 4096 t 0 d 0 ext )
I1219 06:26:21.520281 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Twalk tag 0 fid 0 newfid 1 
I1219 06:26:21.520318 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rwalk tag 0 
I1219 06:26:21.520439 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Topen tag 0 fid 1 mode 0
I1219 06:26:21.520495 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Ropen tag 0 qid (ed779a 3549ac01 'd') iounit 0
I1219 06:26:21.520652 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 0
I1219 06:26:21.520716 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (ed779a 3549ac01 'd') m d775 at 0 mt 1766125579 l 4096 t 0 d 0 ext )
I1219 06:26:21.520887 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tread tag 0 fid 1 offset 0 count 262120
I1219 06:26:21.520987 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rread tag 0 count 258
I1219 06:26:21.521124 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tread tag 0 fid 1 offset 258 count 261862
I1219 06:26:21.521154 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rread tag 0 count 0
I1219 06:26:21.521308 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tread tag 0 fid 1 offset 258 count 262120
I1219 06:26:21.521339 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rread tag 0 count 0
I1219 06:26:21.521481 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I1219 06:26:21.521517 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rwalk tag 0 (ed779b 3549ac01 '') 
I1219 06:26:21.521642 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 2
I1219 06:26:21.521678 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (ed779b 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:21.521801 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 2
I1219 06:26:21.521831 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (ed779b 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:21.521962 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tclunk tag 0 fid 2
I1219 06:26:21.521998 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rclunk tag 0
I1219 06:26:21.522131 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Twalk tag 0 fid 0 newfid 2 0:'test-1766125579269030938' 
I1219 06:26:21.522197 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rwalk tag 0 (ed779d 3549ac01 '') 
I1219 06:26:21.522342 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 2
I1219 06:26:21.522400 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('test-1766125579269030938' 'jenkins' 'jenkins' '' q (ed779d 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:21.522520 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 2
I1219 06:26:21.522562 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('test-1766125579269030938' 'jenkins' 'jenkins' '' q (ed779d 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:21.522696 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tclunk tag 0 fid 2
I1219 06:26:21.522750 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rclunk tag 0
I1219 06:26:21.522905 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Twalk tag 0 fid 0 newfid 2 0:'created-by-test-removed-by-pod' 
I1219 06:26:21.522942 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rwalk tag 0 (ed779c 3549ac01 '') 
I1219 06:26:21.523063 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 2
I1219 06:26:21.523105 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (ed779c 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:21.523241 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tstat tag 0 fid 2
I1219 06:26:21.523298 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (ed779c 3549ac01 '') m 644 at 0 mt 1766125579 l 24 t 0 d 0 ext )
I1219 06:26:21.523410 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tclunk tag 0 fid 2
I1219 06:26:21.523438 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rclunk tag 0
I1219 06:26:21.523567 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tread tag 0 fid 1 offset 258 count 262120
I1219 06:26:21.523600 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rread tag 0 count 0
I1219 06:26:21.523743 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tclunk tag 0 fid 1
I1219 06:26:21.523776 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rclunk tag 0
I1219 06:26:21.525045 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Twalk tag 0 fid 0 newfid 1 0:'pod-dates' 
I1219 06:26:21.525120 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rerror tag 0 ename 'file not found' ecode 0
I1219 06:26:21.795227 2080169 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:41982 Tclunk tag 0 fid 0
I1219 06:26:21.795280 2080169 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:41982 Rclunk tag 0
I1219 06:26:21.796359 2080169 main.go:127] stdlog: ufs.go:147 disconnected
I1219 06:26:21.818462 2080169 out.go:179] * Unmounting /mount-9p ...
I1219 06:26:21.821375 2080169 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f -l /mount-9p || echo "
I1219 06:26:21.828956 2080169 mount.go:180] unmount for /mount-9p ran successfully
I1219 06:26:21.829069 2080169 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/.mount-process: {Name:mkd9e5a66e860aed8a28479b1363dc525a7823fc Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I1219 06:26:21.832169 2080169 out.go:203] 
W1219 06:26:21.835204 2080169 out.go:285] X Exiting due to MK_INTERRUPTED: Received terminated signal
X Exiting due to MK_INTERRUPTED: Received terminated signal
I1219 06:26:21.838055 2080169 out.go:203] 
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/any-port (2.65s)
TestKubernetesUpgrade (800.57s)
=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade
=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:222: (dbg) Run:  out/minikube-linux-arm64 start -p kubernetes-upgrade-352421 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
E1219 06:54:34.246833 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:222: (dbg) Done: out/minikube-linux-arm64 start -p kubernetes-upgrade-352421 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (37.112612643s)
version_upgrade_test.go:227: (dbg) Run:  out/minikube-linux-arm64 stop -p kubernetes-upgrade-352421
version_upgrade_test.go:227: (dbg) Done: out/minikube-linux-arm64 stop -p kubernetes-upgrade-352421: (1.572057219s)
version_upgrade_test.go:232: (dbg) Run:  out/minikube-linux-arm64 -p kubernetes-upgrade-352421 status --format={{.Host}}
version_upgrade_test.go:232: (dbg) Non-zero exit: out/minikube-linux-arm64 -p kubernetes-upgrade-352421 status --format={{.Host}}: exit status 7 (72.820644ms)
-- stdout --
	Stopped
-- /stdout --
version_upgrade_test.go:234: status error: exit status 7 (may be ok)
version_upgrade_test.go:243: (dbg) Run:  out/minikube-linux-arm64 start -p kubernetes-upgrade-352421 --memory=3072 --kubernetes-version=v1.35.0-rc.1 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p kubernetes-upgrade-352421 --memory=3072 --kubernetes-version=v1.35.0-rc.1 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: exit status 109 (12m36.264491151s)
-- stdout --
	* [kubernetes-upgrade-352421] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22230
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "kubernetes-upgrade-352421" primary control-plane node in "kubernetes-upgrade-352421" cluster
	* Pulling base image v0.0.48-1765966054-22186 ...
	* Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	
	
-- /stdout --
** stderr ** 
	I1219 06:54:53.022451 2212400 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:54:53.022577 2212400 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:54:53.022588 2212400 out.go:374] Setting ErrFile to fd 2...
	I1219 06:54:53.022593 2212400 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:54:53.022848 2212400 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:54:53.023196 2212400 out.go:368] Setting JSON to false
	I1219 06:54:53.024133 2212400 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":41839,"bootTime":1766085454,"procs":196,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:54:53.024200 2212400 start.go:143] virtualization:  
	I1219 06:54:53.029390 2212400 out.go:179] * [kubernetes-upgrade-352421] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:54:53.032622 2212400 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:54:53.032668 2212400 notify.go:221] Checking for updates...
	I1219 06:54:53.038633 2212400 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:54:53.041881 2212400 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:54:53.044839 2212400 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:54:53.047918 2212400 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:54:53.050938 2212400 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:54:53.054467 2212400 config.go:182] Loaded profile config "kubernetes-upgrade-352421": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.28.0
	I1219 06:54:53.055247 2212400 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:54:53.087085 2212400 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:54:53.087189 2212400 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:54:53.172703 2212400 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:2 ContainersRunning:1 ContainersPaused:0 ContainersStopped:1 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:39 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:54:53.161256669 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:54:53.172853 2212400 docker.go:319] overlay module found
	I1219 06:54:53.178176 2212400 out.go:179] * Using the docker driver based on existing profile
	I1219 06:54:53.181014 2212400 start.go:309] selected driver: docker
	I1219 06:54:53.181041 2212400 start.go:928] validating driver "docker" against &{Name:kubernetes-upgrade-352421 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.0 ClusterName:kubernetes-upgrade-352421 Namespace:default APIServerHA
VIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.28.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false Cu
stomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:54:53.181144 2212400 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:54:53.181837 2212400 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:54:53.269398 2212400 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:2 ContainersRunning:1 ContainersPaused:0 ContainersStopped:1 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:39 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:54:53.259692274 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:54:53.269729 2212400 cni.go:84] Creating CNI manager for ""
	I1219 06:54:53.269798 2212400 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:54:53.269849 2212400 start.go:353] cluster config:
	{Name:kubernetes-upgrade-352421 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:kubernetes-upgrade-352421 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:c
luster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP
: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:54:53.276747 2212400 out.go:179] * Starting "kubernetes-upgrade-352421" primary control-plane node in "kubernetes-upgrade-352421" cluster
	I1219 06:54:53.279739 2212400 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1219 06:54:53.282876 2212400 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1219 06:54:53.285965 2212400 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1219 06:54:53.286552 2212400 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:54:53.286592 2212400 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
	I1219 06:54:53.286611 2212400 cache.go:65] Caching tarball of preloaded images
	I1219 06:54:53.286686 2212400 preload.go:238] Found /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1219 06:54:53.286701 2212400 cache.go:68] Finished verifying existence of preloaded tar for v1.35.0-rc.1 on containerd
	I1219 06:54:53.286808 2212400 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/kubernetes-upgrade-352421/config.json ...
	I1219 06:54:53.305207 2212400 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1219 06:54:53.305226 2212400 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1219 06:54:53.305247 2212400 cache.go:243] Successfully downloaded all kic artifacts
	I1219 06:54:53.305276 2212400 start.go:360] acquireMachinesLock for kubernetes-upgrade-352421: {Name:mk79587ab2ca210472b684295d545ecd9f46e63e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1219 06:54:53.305327 2212400 start.go:364] duration metric: took 33.437µs to acquireMachinesLock for "kubernetes-upgrade-352421"
	I1219 06:54:53.305347 2212400 start.go:96] Skipping create...Using existing machine configuration
	I1219 06:54:53.305352 2212400 fix.go:54] fixHost starting: 
	I1219 06:54:53.305613 2212400 cli_runner.go:164] Run: docker container inspect kubernetes-upgrade-352421 --format={{.State.Status}}
	I1219 06:54:53.324794 2212400 fix.go:112] recreateIfNeeded on kubernetes-upgrade-352421: state=Stopped err=<nil>
	W1219 06:54:53.324823 2212400 fix.go:138] unexpected machine state, will restart: <nil>
	I1219 06:54:53.328359 2212400 out.go:252] * Restarting existing docker container for "kubernetes-upgrade-352421" ...
	I1219 06:54:53.328458 2212400 cli_runner.go:164] Run: docker start kubernetes-upgrade-352421
	I1219 06:54:53.662673 2212400 cli_runner.go:164] Run: docker container inspect kubernetes-upgrade-352421 --format={{.State.Status}}
	I1219 06:54:53.681788 2212400 kic.go:430] container "kubernetes-upgrade-352421" state is running.
	I1219 06:54:53.682442 2212400 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-352421
	I1219 06:54:53.710382 2212400 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/kubernetes-upgrade-352421/config.json ...
	I1219 06:54:53.710613 2212400 machine.go:94] provisionDockerMachine start ...
	I1219 06:54:53.710677 2212400 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-352421
	I1219 06:54:53.742946 2212400 main.go:144] libmachine: Using SSH client type: native
	I1219 06:54:53.743275 2212400 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34929 <nil> <nil>}
	I1219 06:54:53.743289 2212400 main.go:144] libmachine: About to run SSH command:
	hostname
	I1219 06:54:53.744910 2212400 main.go:144] libmachine: Error dialing TCP: ssh: handshake failed: read tcp 127.0.0.1:36494->127.0.0.1:34929: read: connection reset by peer
	I1219 06:54:56.921208 2212400 main.go:144] libmachine: SSH cmd err, output: <nil>: kubernetes-upgrade-352421
	
	I1219 06:54:56.921231 2212400 ubuntu.go:182] provisioning hostname "kubernetes-upgrade-352421"
	I1219 06:54:56.921296 2212400 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-352421
	I1219 06:54:56.945250 2212400 main.go:144] libmachine: Using SSH client type: native
	I1219 06:54:56.945568 2212400 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34929 <nil> <nil>}
	I1219 06:54:56.945583 2212400 main.go:144] libmachine: About to run SSH command:
	sudo hostname kubernetes-upgrade-352421 && echo "kubernetes-upgrade-352421" | sudo tee /etc/hostname
	I1219 06:54:57.128879 2212400 main.go:144] libmachine: SSH cmd err, output: <nil>: kubernetes-upgrade-352421
	
	I1219 06:54:57.128958 2212400 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-352421
	I1219 06:54:57.151149 2212400 main.go:144] libmachine: Using SSH client type: native
	I1219 06:54:57.151462 2212400 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34929 <nil> <nil>}
	I1219 06:54:57.151479 2212400 main.go:144] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\skubernetes-upgrade-352421' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 kubernetes-upgrade-352421/g' /etc/hosts;
				else 
					echo '127.0.1.1 kubernetes-upgrade-352421' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1219 06:54:57.315878 2212400 main.go:144] libmachine: SSH cmd err, output: <nil>: 
	I1219 06:54:57.315910 2212400 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22230-1998525/.minikube CaCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22230-1998525/.minikube}
	I1219 06:54:57.315947 2212400 ubuntu.go:190] setting up certificates
	I1219 06:54:57.315958 2212400 provision.go:84] configureAuth start
	I1219 06:54:57.316021 2212400 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-352421
	I1219 06:54:57.333979 2212400 provision.go:143] copyHostCerts
	I1219 06:54:57.334061 2212400 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem, removing ...
	I1219 06:54:57.334079 2212400 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 06:54:57.334169 2212400 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem (1671 bytes)
	I1219 06:54:57.334292 2212400 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem, removing ...
	I1219 06:54:57.334304 2212400 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 06:54:57.334335 2212400 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem (1078 bytes)
	I1219 06:54:57.334399 2212400 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem, removing ...
	I1219 06:54:57.334409 2212400 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 06:54:57.334434 2212400 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem (1123 bytes)
	I1219 06:54:57.334486 2212400 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem org=jenkins.kubernetes-upgrade-352421 san=[127.0.0.1 192.168.76.2 kubernetes-upgrade-352421 localhost minikube]
	I1219 06:54:57.564038 2212400 provision.go:177] copyRemoteCerts
	I1219 06:54:57.564107 2212400 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1219 06:54:57.564164 2212400 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-352421
	I1219 06:54:57.582708 2212400 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34929 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/kubernetes-upgrade-352421/id_rsa Username:docker}
	I1219 06:54:57.688494 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1219 06:54:57.706975 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem --> /etc/docker/server.pem (1241 bytes)
	I1219 06:54:57.725400 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1219 06:54:57.742727 2212400 provision.go:87] duration metric: took 426.724833ms to configureAuth
	I1219 06:54:57.742762 2212400 ubuntu.go:206] setting minikube options for container-runtime
	I1219 06:54:57.742932 2212400 config.go:182] Loaded profile config "kubernetes-upgrade-352421": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:54:57.742944 2212400 machine.go:97] duration metric: took 4.032324351s to provisionDockerMachine
	I1219 06:54:57.742952 2212400 start.go:293] postStartSetup for "kubernetes-upgrade-352421" (driver="docker")
	I1219 06:54:57.742963 2212400 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1219 06:54:57.743023 2212400 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1219 06:54:57.743070 2212400 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-352421
	I1219 06:54:57.762017 2212400 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34929 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/kubernetes-upgrade-352421/id_rsa Username:docker}
	I1219 06:54:57.874777 2212400 ssh_runner.go:195] Run: cat /etc/os-release
	I1219 06:54:57.878707 2212400 main.go:144] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1219 06:54:57.878729 2212400 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1219 06:54:57.878740 2212400 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/addons for local assets ...
	I1219 06:54:57.878795 2212400 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/files for local assets ...
	I1219 06:54:57.878874 2212400 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> 20003862.pem in /etc/ssl/certs
	I1219 06:54:57.878974 2212400 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1219 06:54:57.887743 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:54:57.918796 2212400 start.go:296] duration metric: took 175.827865ms for postStartSetup
	I1219 06:54:57.918887 2212400 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:54:57.918980 2212400 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-352421
	I1219 06:54:57.949194 2212400 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34929 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/kubernetes-upgrade-352421/id_rsa Username:docker}
	I1219 06:54:58.075077 2212400 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1219 06:54:58.086064 2212400 fix.go:56] duration metric: took 4.780705728s for fixHost
	I1219 06:54:58.086165 2212400 start.go:83] releasing machines lock for "kubernetes-upgrade-352421", held for 4.780817541s
	I1219 06:54:58.086258 2212400 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-352421
	I1219 06:54:58.114071 2212400 ssh_runner.go:195] Run: cat /version.json
	I1219 06:54:58.114184 2212400 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-352421
	I1219 06:54:58.114619 2212400 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1219 06:54:58.114673 2212400 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-352421
	I1219 06:54:58.147668 2212400 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34929 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/kubernetes-upgrade-352421/id_rsa Username:docker}
	I1219 06:54:58.168341 2212400 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34929 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/kubernetes-upgrade-352421/id_rsa Username:docker}
	I1219 06:54:58.402612 2212400 ssh_runner.go:195] Run: systemctl --version
	I1219 06:54:58.409943 2212400 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1219 06:54:58.415624 2212400 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1219 06:54:58.415707 2212400 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1219 06:54:58.428691 2212400 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1219 06:54:58.428716 2212400 start.go:496] detecting cgroup driver to use...
	I1219 06:54:58.428749 2212400 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1219 06:54:58.428923 2212400 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1219 06:54:58.448542 2212400 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1219 06:54:58.465900 2212400 docker.go:218] disabling cri-docker service (if available) ...
	I1219 06:54:58.465987 2212400 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1219 06:54:58.482623 2212400 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1219 06:54:58.496936 2212400 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1219 06:54:58.641811 2212400 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1219 06:54:58.790993 2212400 docker.go:234] disabling docker service ...
	I1219 06:54:58.791063 2212400 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1219 06:54:58.809751 2212400 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1219 06:54:58.828532 2212400 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1219 06:54:59.010010 2212400 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1219 06:54:59.144828 2212400 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1219 06:54:59.160931 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1219 06:54:59.179762 2212400 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1219 06:54:59.192291 2212400 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1219 06:54:59.202512 2212400 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1219 06:54:59.202583 2212400 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1219 06:54:59.217112 2212400 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:54:59.227662 2212400 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1219 06:54:59.238119 2212400 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 06:54:59.247468 2212400 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1219 06:54:59.267072 2212400 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1219 06:54:59.281935 2212400 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1219 06:54:59.294625 2212400 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1219 06:54:59.304044 2212400 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1219 06:54:59.321283 2212400 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1219 06:54:59.328825 2212400 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:54:59.493908 2212400 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1219 06:54:59.712006 2212400 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1219 06:54:59.712123 2212400 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1219 06:54:59.718135 2212400 start.go:564] Will wait 60s for crictl version
	I1219 06:54:59.718256 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:54:59.722226 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1219 06:54:59.751206 2212400 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1219 06:54:59.751284 2212400 ssh_runner.go:195] Run: containerd --version
	I1219 06:54:59.782771 2212400 ssh_runner.go:195] Run: containerd --version
	I1219 06:54:59.812791 2212400 out.go:179] * Preparing Kubernetes v1.35.0-rc.1 on containerd 2.2.0 ...
	I1219 06:54:59.815979 2212400 cli_runner.go:164] Run: docker network inspect kubernetes-upgrade-352421 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 06:54:59.839648 2212400 ssh_runner.go:195] Run: grep 192.168.76.1	host.minikube.internal$ /etc/hosts
	I1219 06:54:59.845443 2212400 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.76.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1219 06:54:59.858930 2212400 kubeadm.go:884] updating cluster {Name:kubernetes-upgrade-352421 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:kubernetes-upgrade-352421 Namespace:default APIServerHAVIP: APIServ
erName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQem
uFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1219 06:54:59.859075 2212400 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
	I1219 06:54:59.859149 2212400 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:54:59.888690 2212400 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.35.0-rc.1". assuming images are not preloaded.
	I1219 06:54:59.888780 2212400 ssh_runner.go:195] Run: which lz4
	I1219 06:54:59.892551 2212400 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I1219 06:54:59.895982 2212400 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I1219 06:54:59.896020 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4 --> /preloaded.tar.lz4 (305659384 bytes)
	I1219 06:55:03.450983 2212400 containerd.go:563] duration metric: took 3.558475422s to copy over tarball
	I1219 06:55:03.451063 2212400 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I1219 06:55:05.575266 2212400 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.124173345s)
	I1219 06:55:05.575341 2212400 kubeadm.go:910] preload failed, will try to load cached images: extracting tarball: 
	** stderr ** 
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Europe: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Brazil: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Canada: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Antarctica: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Chile: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Etc: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Pacific: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Mexico: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Australia: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/US: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Asia: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Atlantic: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/America: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Arctic: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Africa: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Indian: Cannot open: File exists
	tar: Exiting with failure status due to previous errors
	
	** /stderr **: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: Process exited with status 2
	stdout:
	
	stderr:
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Europe: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Brazil: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Canada: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Antarctica: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Chile: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Etc: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Pacific: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Mexico: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Australia: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/US: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Asia: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Atlantic: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/America: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Arctic: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Africa: Cannot open: File exists
	tar: ./lib/containerd/io.containerd.snapshotter.v1.overlayfs/snapshots/3/fs/usr/share/zoneinfo/posix/Indian: Cannot open: File exists
	tar: Exiting with failure status due to previous errors
	I1219 06:55:05.575430 2212400 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 06:55:05.620257 2212400 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.35.0-rc.1". assuming images are not preloaded.
	I1219 06:55:05.620326 2212400 cache_images.go:90] LoadCachedImages start: [registry.k8s.io/kube-apiserver:v1.35.0-rc.1 registry.k8s.io/kube-controller-manager:v1.35.0-rc.1 registry.k8s.io/kube-scheduler:v1.35.0-rc.1 registry.k8s.io/kube-proxy:v1.35.0-rc.1 registry.k8s.io/pause:3.10.1 registry.k8s.io/etcd:3.6.6-0 registry.k8s.io/coredns/coredns:v1.13.1 gcr.io/k8s-minikube/storage-provisioner:v5]
	I1219 06:55:05.620424 2212400 image.go:138] retrieving image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1219 06:55:05.620657 2212400 image.go:138] retrieving image: registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1219 06:55:05.620829 2212400 image.go:138] retrieving image: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1219 06:55:05.620866 2212400 image.go:138] retrieving image: registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1219 06:55:05.621567 2212400 image.go:138] retrieving image: registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1219 06:55:05.621936 2212400 image.go:138] retrieving image: registry.k8s.io/etcd:3.6.6-0
	I1219 06:55:05.622505 2212400 image.go:138] retrieving image: registry.k8s.io/coredns/coredns:v1.13.1
	I1219 06:55:05.621939 2212400 image.go:138] retrieving image: registry.k8s.io/pause:3.10.1
	I1219 06:55:05.623585 2212400 image.go:181] daemon lookup for registry.k8s.io/kube-apiserver:v1.35.0-rc.1: Error response from daemon: No such image: registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1219 06:55:05.624281 2212400 image.go:181] daemon lookup for registry.k8s.io/kube-scheduler:v1.35.0-rc.1: Error response from daemon: No such image: registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1219 06:55:05.624359 2212400 image.go:181] daemon lookup for registry.k8s.io/kube-proxy:v1.35.0-rc.1: Error response from daemon: No such image: registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1219 06:55:05.624576 2212400 image.go:181] daemon lookup for registry.k8s.io/kube-controller-manager:v1.35.0-rc.1: Error response from daemon: No such image: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1219 06:55:05.624641 2212400 image.go:181] daemon lookup for gcr.io/k8s-minikube/storage-provisioner:v5: Error response from daemon: No such image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1219 06:55:05.625530 2212400 image.go:181] daemon lookup for registry.k8s.io/etcd:3.6.6-0: Error response from daemon: No such image: registry.k8s.io/etcd:3.6.6-0
	I1219 06:55:05.634024 2212400 image.go:181] daemon lookup for registry.k8s.io/coredns/coredns:v1.13.1: Error response from daemon: No such image: registry.k8s.io/coredns/coredns:v1.13.1
	I1219 06:55:05.634128 2212400 image.go:181] daemon lookup for registry.k8s.io/pause:3.10.1: Error response from daemon: No such image: registry.k8s.io/pause:3.10.1
	I1219 06:55:05.955335 2212400 containerd.go:267] Checking existence of image with name "registry.k8s.io/etcd:3.6.6-0" and sha "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57"
	I1219 06:55:05.955458 2212400 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/etcd:3.6.6-0
	I1219 06:55:05.973582 2212400 containerd.go:267] Checking existence of image with name "registry.k8s.io/coredns/coredns:v1.13.1" and sha "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf"
	I1219 06:55:05.973683 2212400 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/coredns/coredns:v1.13.1
	I1219 06:55:05.974844 2212400 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-scheduler:v1.35.0-rc.1" and sha "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde"
	I1219 06:55:05.974911 2212400 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1219 06:55:05.979357 2212400 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-proxy:v1.35.0-rc.1" and sha "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e"
	I1219 06:55:05.981764 2212400 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1219 06:55:05.997066 2212400 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" and sha "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a"
	I1219 06:55:05.997163 2212400 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1219 06:55:06.014607 2212400 containerd.go:267] Checking existence of image with name "registry.k8s.io/pause:3.10.1" and sha "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd"
	I1219 06:55:06.014696 2212400 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/pause:3.10.1
	I1219 06:55:06.037938 2212400 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-apiserver:v1.35.0-rc.1" and sha "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54"
	I1219 06:55:06.038019 2212400 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1219 06:55:06.131982 2212400 cache_images.go:118] "registry.k8s.io/etcd:3.6.6-0" needs transfer: "registry.k8s.io/etcd:3.6.6-0" does not exist at hash "271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57" in container runtime
	I1219 06:55:06.132089 2212400 cri.go:221] Removing image: registry.k8s.io/etcd:3.6.6-0
	I1219 06:55:06.132172 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:55:06.155263 2212400 cache_images.go:118] "registry.k8s.io/coredns/coredns:v1.13.1" needs transfer: "registry.k8s.io/coredns/coredns:v1.13.1" does not exist at hash "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf" in container runtime
	I1219 06:55:06.155369 2212400 cri.go:221] Removing image: registry.k8s.io/coredns/coredns:v1.13.1
	I1219 06:55:06.155449 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:55:06.165854 2212400 cache_images.go:118] "registry.k8s.io/kube-scheduler:v1.35.0-rc.1" needs transfer: "registry.k8s.io/kube-scheduler:v1.35.0-rc.1" does not exist at hash "abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde" in container runtime
	I1219 06:55:06.165902 2212400 cri.go:221] Removing image: registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1219 06:55:06.165941 2212400 cache_images.go:118] "registry.k8s.io/kube-proxy:v1.35.0-rc.1" needs transfer: "registry.k8s.io/kube-proxy:v1.35.0-rc.1" does not exist at hash "7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e" in container runtime
	I1219 06:55:06.165981 2212400 cri.go:221] Removing image: registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1219 06:55:06.165956 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:55:06.166046 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:55:06.173602 2212400 cache_images.go:118] "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" needs transfer: "registry.k8s.io/kube-controller-manager:v1.35.0-rc.1" does not exist at hash "a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a" in container runtime
	I1219 06:55:06.173657 2212400 cri.go:221] Removing image: registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1219 06:55:06.173713 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:55:06.173788 2212400 cache_images.go:118] "registry.k8s.io/pause:3.10.1" needs transfer: "registry.k8s.io/pause:3.10.1" does not exist at hash "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd" in container runtime
	I1219 06:55:06.173809 2212400 cri.go:221] Removing image: registry.k8s.io/pause:3.10.1
	I1219 06:55:06.173830 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:55:06.173893 2212400 cache_images.go:118] "registry.k8s.io/kube-apiserver:v1.35.0-rc.1" needs transfer: "registry.k8s.io/kube-apiserver:v1.35.0-rc.1" does not exist at hash "3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54" in container runtime
	I1219 06:55:06.173908 2212400 cri.go:221] Removing image: registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1219 06:55:06.173934 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:55:06.177686 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.6-0
	I1219 06:55:06.177786 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1219 06:55:06.180988 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1219 06:55:06.181062 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1219 06:55:06.184602 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1219 06:55:06.184694 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1219 06:55:06.184746 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1219 06:55:06.290995 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1219 06:55:06.291082 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.6-0
	I1219 06:55:06.291152 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1219 06:55:06.291209 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1219 06:55:06.301898 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1219 06:55:06.301989 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1219 06:55:06.307211 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1219 06:55:06.400622 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.6-0
	I1219 06:55:06.400829 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-rc.1
	I1219 06:55:06.400931 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1219 06:55:06.400990 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-rc.1
	I1219 06:55:06.421623 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-rc.1
	I1219 06:55:06.421746 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1219 06:55:06.421831 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
	I1219 06:55:06.515729 2212400 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-rc.1
	I1219 06:55:06.515807 2212400 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1
	I1219 06:55:06.515850 2212400 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-rc.1
	I1219 06:55:06.515884 2212400 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.6-0
	I1219 06:55:06.517633 2212400 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1
	I1219 06:55:06.517744 2212400 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1
	I1219 06:55:06.517827 2212400 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-rc.1
	I1219 06:55:06.517873 2212400 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-rc.1
	I1219 06:55:06.521640 2212400 ssh_runner.go:352] existence check for /var/lib/minikube/images/pause_3.10.1: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/pause_3.10.1': No such file or directory
	I1219 06:55:06.521726 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 --> /var/lib/minikube/images/pause_3.10.1 (268288 bytes)
	I1219 06:55:06.548683 2212400 containerd.go:285] Loading image: /var/lib/minikube/images/pause_3.10.1
	I1219 06:55:06.548795 2212400 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/pause_3.10.1
	W1219 06:55:06.822679 2212400 image.go:328] image gcr.io/k8s-minikube/storage-provisioner:v5 arch mismatch: want arm64 got amd64. fixing
	I1219 06:55:06.822854 2212400 containerd.go:267] Checking existence of image with name "gcr.io/k8s-minikube/storage-provisioner:v5" and sha "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51"
	I1219 06:55:06.822923 2212400 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==gcr.io/k8s-minikube/storage-provisioner:v5
	I1219 06:55:07.083423 2212400 cache_images.go:118] "gcr.io/k8s-minikube/storage-provisioner:v5" needs transfer: "gcr.io/k8s-minikube/storage-provisioner:v5" does not exist at hash "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51" in container runtime
	I1219 06:55:07.083493 2212400 cri.go:221] Removing image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1219 06:55:07.083567 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:55:07.087772 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1219 06:55:07.262774 2212400 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5
	I1219 06:55:07.262991 2212400 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5
	I1219 06:55:07.266874 2212400 ssh_runner.go:352] existence check for /var/lib/minikube/images/storage-provisioner_v5: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/storage-provisioner_v5': No such file or directory
	I1219 06:55:07.266924 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 --> /var/lib/minikube/images/storage-provisioner_v5 (8035840 bytes)
	I1219 06:55:07.353086 2212400 containerd.go:285] Loading image: /var/lib/minikube/images/storage-provisioner_v5
	I1219 06:55:07.353156 2212400 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/storage-provisioner_v5
	I1219 06:55:08.256740 2212400 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 from cache
	I1219 06:55:08.256820 2212400 cache_images.go:94] duration metric: took 2.636463325s to LoadCachedImages
	W1219 06:55:08.256916 2212400 out.go:285] X Unable to load cached images: LoadCachedImages: stat /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-rc.1: no such file or directory
	X Unable to load cached images: LoadCachedImages: stat /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-rc.1: no such file or directory
	I1219 06:55:08.256931 2212400 kubeadm.go:935] updating node { 192.168.76.2 8443 v1.35.0-rc.1 containerd true true} ...
	I1219 06:55:08.257032 2212400 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-rc.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=kubernetes-upgrade-352421 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.76.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-rc.1 ClusterName:kubernetes-upgrade-352421 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1219 06:55:08.257101 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s info
	I1219 06:55:08.281922 2212400 cni.go:84] Creating CNI manager for ""
	I1219 06:55:08.281945 2212400 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 06:55:08.281965 2212400 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1219 06:55:08.281988 2212400 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.76.2 APIServerPort:8443 KubernetesVersion:v1.35.0-rc.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:kubernetes-upgrade-352421 NodeName:kubernetes-upgrade-352421 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.76.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.76.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/ce
rts/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1219 06:55:08.282118 2212400 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.76.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "kubernetes-upgrade-352421"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.76.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.76.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-rc.1
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1219 06:55:08.282192 2212400 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-rc.1
	I1219 06:55:08.296243 2212400 binaries.go:51] Found k8s binaries, skipping transfer
	I1219 06:55:08.296319 2212400 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1219 06:55:08.304927 2212400 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (334 bytes)
	I1219 06:55:08.318273 2212400 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (357 bytes)
	I1219 06:55:08.331990 2212400 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2243 bytes)
	I1219 06:55:08.345966 2212400 ssh_runner.go:195] Run: grep 192.168.76.2	control-plane.minikube.internal$ /etc/hosts
	I1219 06:55:08.349729 2212400 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.76.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1219 06:55:08.365670 2212400 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 06:55:08.595209 2212400 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 06:55:08.615772 2212400 certs.go:69] Setting up /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/kubernetes-upgrade-352421 for IP: 192.168.76.2
	I1219 06:55:08.615837 2212400 certs.go:195] generating shared ca certs ...
	I1219 06:55:08.615867 2212400 certs.go:227] acquiring lock for ca certs: {Name:mk382c71693ea4061363f97b153b21bf6cdf5f38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:55:08.616038 2212400 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key
	I1219 06:55:08.616109 2212400 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key
	I1219 06:55:08.616130 2212400 certs.go:257] generating profile certs ...
	I1219 06:55:08.616263 2212400 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/kubernetes-upgrade-352421/client.key
	I1219 06:55:08.616380 2212400 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/kubernetes-upgrade-352421/apiserver.key.82e79b18
	I1219 06:55:08.616445 2212400 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/kubernetes-upgrade-352421/proxy-client.key
	I1219 06:55:08.616585 2212400 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem (1338 bytes)
	W1219 06:55:08.616647 2212400 certs.go:480] ignoring /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386_empty.pem, impossibly tiny 0 bytes
	I1219 06:55:08.616671 2212400 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem (1679 bytes)
	I1219 06:55:08.616726 2212400 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem (1078 bytes)
	I1219 06:55:08.616911 2212400 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem (1123 bytes)
	I1219 06:55:08.616973 2212400 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem (1671 bytes)
	I1219 06:55:08.617050 2212400 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 06:55:08.617637 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1219 06:55:08.638940 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1219 06:55:08.682985 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1219 06:55:08.708978 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1219 06:55:08.740865 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/kubernetes-upgrade-352421/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I1219 06:55:08.767461 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/kubernetes-upgrade-352421/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1219 06:55:08.788285 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/kubernetes-upgrade-352421/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1219 06:55:08.816435 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/kubernetes-upgrade-352421/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1219 06:55:08.846622 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1219 06:55:08.872663 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem --> /usr/share/ca-certificates/2000386.pem (1338 bytes)
	I1219 06:55:08.900622 2212400 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /usr/share/ca-certificates/20003862.pem (1708 bytes)
	I1219 06:55:08.919771 2212400 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (722 bytes)
	I1219 06:55:08.955730 2212400 ssh_runner.go:195] Run: openssl version
	I1219 06:55:08.968197 2212400 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:55:08.977689 2212400 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1219 06:55:08.987491 2212400 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:55:08.992566 2212400 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:55:08.992633 2212400 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1219 06:55:09.044078 2212400 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1219 06:55:09.052351 2212400 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/2000386.pem
	I1219 06:55:09.059945 2212400 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/2000386.pem /etc/ssl/certs/2000386.pem
	I1219 06:55:09.067630 2212400 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/2000386.pem
	I1219 06:55:09.072621 2212400 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 06:55:09.072742 2212400 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/2000386.pem
	I1219 06:55:09.126931 2212400 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1219 06:55:09.146345 2212400 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/20003862.pem
	I1219 06:55:09.158660 2212400 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/20003862.pem /etc/ssl/certs/20003862.pem
	I1219 06:55:09.170470 2212400 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/20003862.pem
	I1219 06:55:09.179407 2212400 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 06:55:09.179472 2212400 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/20003862.pem
	I1219 06:55:09.269867 2212400 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1219 06:55:09.286220 2212400 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 06:55:09.293447 2212400 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1219 06:55:09.409444 2212400 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1219 06:55:09.499218 2212400 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1219 06:55:09.648579 2212400 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1219 06:55:09.765284 2212400 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1219 06:55:09.850109 2212400 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1219 06:55:09.936115 2212400 kubeadm.go:401] StartCluster: {Name:kubernetes-upgrade-352421 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:kubernetes-upgrade-352421 Namespace:default APIServerHAVIP: APIServerN
ame:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFi
rmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:55:09.936244 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1219 06:55:09.936332 2212400 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:55:10.052691 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:55:10.052847 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:55:10.052871 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:55:10.052889 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:55:10.052905 2212400 cri.go:92] found id: ""
	I1219 06:55:10.052990 2212400 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	W1219 06:55:10.100205 2212400 kubeadm.go:408] unpause failed: list paused: runc: sudo runc --root /run/containerd/runc/k8s.io list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-19T06:55:10Z" level=error msg="open /run/containerd/runc/k8s.io: no such file or directory"
	I1219 06:55:10.100277 2212400 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1219 06:55:10.112915 2212400 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1219 06:55:10.112935 2212400 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1219 06:55:10.112989 2212400 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1219 06:55:10.134926 2212400 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:55:10.135333 2212400 kubeconfig.go:47] verify endpoint returned: get endpoint: "kubernetes-upgrade-352421" does not appear in /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:55:10.135435 2212400 kubeconfig.go:62] /home/jenkins/minikube-integration/22230-1998525/kubeconfig needs updating (will repair): [kubeconfig missing "kubernetes-upgrade-352421" cluster setting kubeconfig missing "kubernetes-upgrade-352421" context setting]
	I1219 06:55:10.135761 2212400 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/kubeconfig: {Name:mk7db1732c7d76f01100426cb283dc7515a3b9ea Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 06:55:10.136273 2212400 kapi.go:59] client config for kubernetes-upgrade-352421: &rest.Config{Host:"https://192.168.76.2:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/kubernetes-upgrade-352421/client.crt", KeyFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/kubernetes-upgrade-352421/client.key", CAFile:"/home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1ffe230), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1219 06:55:10.138181 2212400 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1219 06:55:10.138240 2212400 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1219 06:55:10.138257 2212400 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1219 06:55:10.138271 2212400 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1219 06:55:10.138276 2212400 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1219 06:55:10.139548 2212400 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1219 06:55:10.154112 2212400 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-19 06:54:29.446342459 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-19 06:55:08.338732660 +0000
	@@ -1,4 +1,4 @@
	-apiVersion: kubeadm.k8s.io/v1beta3
	+apiVersion: kubeadm.k8s.io/v1beta4
	 kind: InitConfiguration
	 localAPIEndpoint:
	   advertiseAddress: 192.168.76.2
	@@ -14,31 +14,34 @@
	   criSocket: unix:///run/containerd/containerd.sock
	   name: "kubernetes-upgrade-352421"
	   kubeletExtraArgs:
	-    node-ip: 192.168.76.2
	+    - name: "node-ip"
	+      value: "192.168.76.2"
	   taints: []
	 ---
	-apiVersion: kubeadm.k8s.io/v1beta3
	+apiVersion: kubeadm.k8s.io/v1beta4
	 kind: ClusterConfiguration
	 apiServer:
	   certSANs: ["127.0.0.1", "localhost", "192.168.76.2"]
	   extraArgs:
	-    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+    - name: "enable-admission-plugins"
	+      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	 controllerManager:
	   extraArgs:
	-    allocate-node-cidrs: "true"
	-    leader-elect: "false"
	+    - name: "allocate-node-cidrs"
	+      value: "true"
	+    - name: "leader-elect"
	+      value: "false"
	 scheduler:
	   extraArgs:
	-    leader-elect: "false"
	+    - name: "leader-elect"
	+      value: "false"
	 certificatesDir: /var/lib/minikube/certs
	 clusterName: mk
	 controlPlaneEndpoint: control-plane.minikube.internal:8443
	 etcd:
	   local:
	     dataDir: /var/lib/minikube/etcd
	-    extraArgs:
	-      proxy-refresh-interval: "70000"
	-kubernetesVersion: v1.28.0
	+kubernetesVersion: v1.35.0-rc.1
	 networking:
	   dnsDomain: cluster.local
	   podSubnet: "10.244.0.0/16"
	
	-- /stdout --
	I1219 06:55:10.154136 2212400 kubeadm.go:1161] stopping kube-system containers ...
	I1219 06:55:10.154151 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1219 06:55:10.154220 2212400 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 06:55:10.234326 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:55:10.234459 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:55:10.234466 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:55:10.234470 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:55:10.234474 2212400 cri.go:92] found id: ""
	I1219 06:55:10.234479 2212400 cri.go:255] Stopping containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:55:10.234561 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:55:10.247169 2212400 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl stop --timeout=10 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5
	I1219 06:55:10.298306 2212400 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1219 06:55:10.326394 2212400 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:55:10.341204 2212400 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5639 Dec 19 06:54 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5652 Dec 19 06:54 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 2039 Dec 19 06:54 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5604 Dec 19 06:54 /etc/kubernetes/scheduler.conf
	
	I1219 06:55:10.341287 2212400 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1219 06:55:10.356225 2212400 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1219 06:55:10.365941 2212400 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1219 06:55:10.376339 2212400 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:55:10.376408 2212400 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:55:10.385371 2212400 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1219 06:55:10.398169 2212400 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1219 06:55:10.398263 2212400 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:55:10.405980 2212400 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1219 06:55:10.429561 2212400 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:55:10.513435 2212400 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:55:11.639281 2212400 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.12580764s)
	I1219 06:55:11.639382 2212400 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:55:11.904047 2212400 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:55:11.997010 2212400 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1219 06:55:12.059840 2212400 api_server.go:52] waiting for apiserver process to appear ...
	I1219 06:55:12.059928 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:12.560079 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:13.060524 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:13.560572 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:14.060951 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:14.560791 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:15.060094 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:15.560073 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:16.060052 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:16.560083 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:17.060926 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:17.560584 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:18.060886 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:18.560116 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:19.060652 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:19.560115 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:20.060117 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:20.561035 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:21.060085 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:21.560132 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:22.060105 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:22.560127 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:23.060950 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:23.560086 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:24.060548 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:24.560117 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:25.060879 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:25.560052 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:26.060657 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:26.560443 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:27.060056 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:27.560853 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:28.060860 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:28.560686 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:29.060588 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:29.560868 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:30.061053 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:30.560392 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:31.060458 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:31.560724 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:32.060081 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:32.560056 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:33.060819 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:33.560344 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:34.060053 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:34.560686 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:35.060967 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:35.560893 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:36.060190 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:36.560930 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:37.061043 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:37.561071 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:38.061063 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:38.560786 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:39.060872 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:39.560194 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:40.060027 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:40.560084 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:41.060012 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:41.560662 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:42.060888 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:42.559995 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:43.060641 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:43.560041 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:44.060867 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:44.560729 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:45.061375 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:45.560665 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:46.060696 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:46.560163 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:47.060425 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:47.560737 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:48.060851 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:48.560067 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:49.060748 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:49.560100 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:50.060148 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:50.560866 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:51.060844 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:51.560040 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:52.061010 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:52.560187 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:53.060730 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:53.560648 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:54.060161 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:54.560957 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:55.060689 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:55.560225 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:56.060319 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:56.560713 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:57.060165 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:57.560688 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:58.060019 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:58.561014 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:59.060136 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:55:59.560847 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:00.060984 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:00.560170 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:01.060142 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:01.560943 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:02.060856 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:02.560836 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:03.060930 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:03.561008 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:04.060806 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:04.560094 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:05.060679 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:05.560118 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:06.060286 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:06.560505 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:07.060107 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:07.560583 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:08.060308 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:08.560134 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:09.060009 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:09.560814 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:10.060716 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:10.560033 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:11.060687 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:11.560804 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:12.060250 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:12.060342 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:12.088808 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:12.088833 2212400 cri.go:92] found id: ""
	I1219 06:56:12.088842 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:12.088900 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:12.092468 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:12.092543 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:12.117745 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:12.117772 2212400 cri.go:92] found id: ""
	I1219 06:56:12.117787 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:12.117845 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:12.121749 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:12.121821 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:12.150647 2212400 cri.go:92] found id: ""
	I1219 06:56:12.150671 2212400 logs.go:282] 0 containers: []
	W1219 06:56:12.150680 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:12.150686 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:12.150760 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:12.174555 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:12.174578 2212400 cri.go:92] found id: ""
	I1219 06:56:12.174586 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:12.174646 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:12.178376 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:12.178481 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:12.207651 2212400 cri.go:92] found id: ""
	I1219 06:56:12.207679 2212400 logs.go:282] 0 containers: []
	W1219 06:56:12.207698 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:12.207706 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:12.207773 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:12.235942 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:12.235967 2212400 cri.go:92] found id: ""
	I1219 06:56:12.235976 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:12.236036 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:12.239921 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:12.239995 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:12.265822 2212400 cri.go:92] found id: ""
	I1219 06:56:12.265849 2212400 logs.go:282] 0 containers: []
	W1219 06:56:12.265857 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:12.265863 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:12.265932 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:12.290461 2212400 cri.go:92] found id: ""
	I1219 06:56:12.290483 2212400 logs.go:282] 0 containers: []
	W1219 06:56:12.290491 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:12.290513 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:12.290525 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:12.307768 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:12.307797 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:12.351023 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:12.351055 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:12.399474 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:12.399512 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:12.428666 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:12.428695 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:12.493250 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:12.493291 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:12.561867 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:12.561891 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:12.561907 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:12.599242 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:12.599277 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:12.635031 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:12.635063 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:15.175665 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:15.186580 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:15.186656 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:15.211623 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:15.211644 2212400 cri.go:92] found id: ""
	I1219 06:56:15.211652 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:15.211713 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:15.215575 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:15.215649 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:15.241004 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:15.241025 2212400 cri.go:92] found id: ""
	I1219 06:56:15.241033 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:15.241092 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:15.244855 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:15.244930 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:15.271403 2212400 cri.go:92] found id: ""
	I1219 06:56:15.271426 2212400 logs.go:282] 0 containers: []
	W1219 06:56:15.271435 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:15.271441 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:15.271502 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:15.297561 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:15.297582 2212400 cri.go:92] found id: ""
	I1219 06:56:15.297590 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:15.297652 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:15.301598 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:15.301676 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:15.326262 2212400 cri.go:92] found id: ""
	I1219 06:56:15.326287 2212400 logs.go:282] 0 containers: []
	W1219 06:56:15.326296 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:15.326302 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:15.326365 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:15.351989 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:15.352013 2212400 cri.go:92] found id: ""
	I1219 06:56:15.352022 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:15.352081 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:15.355670 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:15.355738 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:15.385824 2212400 cri.go:92] found id: ""
	I1219 06:56:15.385851 2212400 logs.go:282] 0 containers: []
	W1219 06:56:15.385861 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:15.385867 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:15.385930 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:15.410395 2212400 cri.go:92] found id: ""
	I1219 06:56:15.410419 2212400 logs.go:282] 0 containers: []
	W1219 06:56:15.410427 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:15.410447 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:15.410459 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:15.444542 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:15.444575 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:15.493391 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:15.493425 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:15.523561 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:15.523600 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:15.540625 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:15.540652 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:15.588906 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:15.588946 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:15.625468 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:15.625505 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:15.677557 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:15.677627 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:15.742158 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:15.742201 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:15.813156 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:18.314041 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:18.324456 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:18.324526 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:18.350900 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:18.350929 2212400 cri.go:92] found id: ""
	I1219 06:56:18.350937 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:18.350996 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:18.355188 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:18.355267 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:18.383610 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:18.383630 2212400 cri.go:92] found id: ""
	I1219 06:56:18.383639 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:18.383696 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:18.387532 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:18.387614 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:18.415460 2212400 cri.go:92] found id: ""
	I1219 06:56:18.415483 2212400 logs.go:282] 0 containers: []
	W1219 06:56:18.415492 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:18.415498 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:18.415559 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:18.440803 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:18.440827 2212400 cri.go:92] found id: ""
	I1219 06:56:18.440836 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:18.440899 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:18.444949 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:18.445033 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:18.473622 2212400 cri.go:92] found id: ""
	I1219 06:56:18.473645 2212400 logs.go:282] 0 containers: []
	W1219 06:56:18.473653 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:18.473660 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:18.473722 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:18.518845 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:18.518870 2212400 cri.go:92] found id: ""
	I1219 06:56:18.518880 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:18.518940 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:18.523013 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:18.523096 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:18.561921 2212400 cri.go:92] found id: ""
	I1219 06:56:18.561947 2212400 logs.go:282] 0 containers: []
	W1219 06:56:18.561956 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:18.561962 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:18.562028 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:18.596736 2212400 cri.go:92] found id: ""
	I1219 06:56:18.596779 2212400 logs.go:282] 0 containers: []
	W1219 06:56:18.596788 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:18.596803 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:18.596814 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:18.688312 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:18.688337 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:18.688351 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:18.727965 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:18.728002 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:18.799564 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:18.799597 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:18.851814 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:18.851850 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:18.912286 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:18.912317 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:18.947931 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:18.947967 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:18.998869 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:18.998898 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:19.062814 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:19.062892 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:21.582467 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:21.592886 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:21.592970 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:21.624683 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:21.624706 2212400 cri.go:92] found id: ""
	I1219 06:56:21.624715 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:21.624830 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:21.628531 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:21.628620 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:21.662922 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:21.662948 2212400 cri.go:92] found id: ""
	I1219 06:56:21.662956 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:21.663014 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:21.667084 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:21.667162 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:21.695536 2212400 cri.go:92] found id: ""
	I1219 06:56:21.695561 2212400 logs.go:282] 0 containers: []
	W1219 06:56:21.695570 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:21.695576 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:21.695639 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:21.724117 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:21.724140 2212400 cri.go:92] found id: ""
	I1219 06:56:21.724148 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:21.724213 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:21.728004 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:21.728077 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:21.753237 2212400 cri.go:92] found id: ""
	I1219 06:56:21.753260 2212400 logs.go:282] 0 containers: []
	W1219 06:56:21.753268 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:21.753275 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:21.753385 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:21.779949 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:21.779974 2212400 cri.go:92] found id: ""
	I1219 06:56:21.779983 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:21.780042 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:21.783804 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:21.783878 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:21.809866 2212400 cri.go:92] found id: ""
	I1219 06:56:21.809890 2212400 logs.go:282] 0 containers: []
	W1219 06:56:21.809898 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:21.809905 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:21.810004 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:21.835517 2212400 cri.go:92] found id: ""
	I1219 06:56:21.835544 2212400 logs.go:282] 0 containers: []
	W1219 06:56:21.835553 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:21.835567 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:21.835577 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:21.894514 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:21.894549 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:21.976884 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:21.976904 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:21.976921 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:22.012320 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:22.012354 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:22.042294 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:22.042336 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:22.072193 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:22.072225 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:22.089335 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:22.089364 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:22.125930 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:22.125964 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:22.163349 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:22.163384 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:24.702748 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:24.712921 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:24.712996 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:24.738203 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:24.738226 2212400 cri.go:92] found id: ""
	I1219 06:56:24.738234 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:24.738290 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:24.741927 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:24.742033 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:24.765840 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:24.765861 2212400 cri.go:92] found id: ""
	I1219 06:56:24.765869 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:24.765926 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:24.769701 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:24.769777 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:24.794350 2212400 cri.go:92] found id: ""
	I1219 06:56:24.794374 2212400 logs.go:282] 0 containers: []
	W1219 06:56:24.794383 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:24.794389 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:24.794450 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:24.819580 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:24.819603 2212400 cri.go:92] found id: ""
	I1219 06:56:24.819612 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:24.819693 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:24.823586 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:24.823680 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:24.849067 2212400 cri.go:92] found id: ""
	I1219 06:56:24.849092 2212400 logs.go:282] 0 containers: []
	W1219 06:56:24.849101 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:24.849107 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:24.849193 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:24.875605 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:24.875629 2212400 cri.go:92] found id: ""
	I1219 06:56:24.875638 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:24.875696 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:24.879519 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:24.879593 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:24.906922 2212400 cri.go:92] found id: ""
	I1219 06:56:24.906950 2212400 logs.go:282] 0 containers: []
	W1219 06:56:24.906961 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:24.906967 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:24.907033 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:24.932721 2212400 cri.go:92] found id: ""
	I1219 06:56:24.932749 2212400 logs.go:282] 0 containers: []
	W1219 06:56:24.932778 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:24.932792 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:24.932804 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:24.997450 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:24.997469 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:24.997481 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:25.039229 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:25.039261 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:25.073957 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:25.073992 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:25.136322 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:25.136355 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:25.153267 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:25.153297 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:25.197592 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:25.197627 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:25.230327 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:25.230356 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:25.260783 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:25.260815 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:27.793410 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:27.803438 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:27.803508 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:27.828529 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:27.828549 2212400 cri.go:92] found id: ""
	I1219 06:56:27.828558 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:27.828616 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:27.832302 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:27.832380 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:27.857038 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:27.857059 2212400 cri.go:92] found id: ""
	I1219 06:56:27.857068 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:27.857127 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:27.860723 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:27.860836 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:27.885492 2212400 cri.go:92] found id: ""
	I1219 06:56:27.885517 2212400 logs.go:282] 0 containers: []
	W1219 06:56:27.885525 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:27.885532 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:27.885591 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:27.910265 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:27.910304 2212400 cri.go:92] found id: ""
	I1219 06:56:27.910314 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:27.910378 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:27.913892 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:27.913964 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:27.938547 2212400 cri.go:92] found id: ""
	I1219 06:56:27.938571 2212400 logs.go:282] 0 containers: []
	W1219 06:56:27.938580 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:27.938586 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:27.938649 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:27.963149 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:27.963171 2212400 cri.go:92] found id: ""
	I1219 06:56:27.963180 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:27.963242 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:27.966881 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:27.966952 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:27.996103 2212400 cri.go:92] found id: ""
	I1219 06:56:27.996127 2212400 logs.go:282] 0 containers: []
	W1219 06:56:27.996135 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:27.996141 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:27.996231 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:28.029951 2212400 cri.go:92] found id: ""
	I1219 06:56:28.029974 2212400 logs.go:282] 0 containers: []
	W1219 06:56:28.029983 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:28.029997 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:28.030036 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:28.088090 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:28.088122 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:28.121966 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:28.122001 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:28.154371 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:28.154404 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:28.187456 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:28.187489 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:28.218114 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:28.218149 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:28.247294 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:28.247323 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:28.263524 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:28.263555 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:28.330725 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:28.330793 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:28.330812 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:30.868289 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:30.878407 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:30.878485 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:30.903979 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:30.904004 2212400 cri.go:92] found id: ""
	I1219 06:56:30.904012 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:30.904072 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:30.907617 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:30.907694 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:30.932502 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:30.932525 2212400 cri.go:92] found id: ""
	I1219 06:56:30.932534 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:30.932593 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:30.936242 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:30.936316 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:30.959729 2212400 cri.go:92] found id: ""
	I1219 06:56:30.959753 2212400 logs.go:282] 0 containers: []
	W1219 06:56:30.959761 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:30.959768 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:30.959831 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:30.985994 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:30.986018 2212400 cri.go:92] found id: ""
	I1219 06:56:30.986027 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:30.986090 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:30.989827 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:30.989900 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:31.031136 2212400 cri.go:92] found id: ""
	I1219 06:56:31.031161 2212400 logs.go:282] 0 containers: []
	W1219 06:56:31.031171 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:31.031178 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:31.031241 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:31.061943 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:31.061967 2212400 cri.go:92] found id: ""
	I1219 06:56:31.061976 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:31.062033 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:31.065771 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:31.065866 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:31.107792 2212400 cri.go:92] found id: ""
	I1219 06:56:31.107815 2212400 logs.go:282] 0 containers: []
	W1219 06:56:31.107824 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:31.107831 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:31.107892 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:31.132333 2212400 cri.go:92] found id: ""
	I1219 06:56:31.132368 2212400 logs.go:282] 0 containers: []
	W1219 06:56:31.132377 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:31.132394 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:31.132406 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:31.149096 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:31.149130 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:31.190008 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:31.190039 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:31.226303 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:31.226350 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:31.262094 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:31.262128 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:31.321022 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:31.321053 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:31.385616 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:31.385640 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:31.385653 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:31.441756 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:31.441808 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:31.478005 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:31.478039 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:34.008832 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:34.020205 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:34.020279 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:34.046458 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:34.046483 2212400 cri.go:92] found id: ""
	I1219 06:56:34.046492 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:34.046555 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:34.050395 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:34.050496 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:34.076980 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:34.077009 2212400 cri.go:92] found id: ""
	I1219 06:56:34.077019 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:34.077086 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:34.081121 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:34.081197 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:34.108156 2212400 cri.go:92] found id: ""
	I1219 06:56:34.108183 2212400 logs.go:282] 0 containers: []
	W1219 06:56:34.108193 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:34.108199 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:34.108261 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:34.134409 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:34.134431 2212400 cri.go:92] found id: ""
	I1219 06:56:34.134440 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:34.134518 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:34.138255 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:34.138333 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:34.168015 2212400 cri.go:92] found id: ""
	I1219 06:56:34.168039 2212400 logs.go:282] 0 containers: []
	W1219 06:56:34.168048 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:34.168063 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:34.168125 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:34.197674 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:34.197696 2212400 cri.go:92] found id: ""
	I1219 06:56:34.197704 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:34.197781 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:34.201373 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:34.201457 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:34.226349 2212400 cri.go:92] found id: ""
	I1219 06:56:34.226390 2212400 logs.go:282] 0 containers: []
	W1219 06:56:34.226401 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:34.226407 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:34.226493 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:34.252809 2212400 cri.go:92] found id: ""
	I1219 06:56:34.252887 2212400 logs.go:282] 0 containers: []
	W1219 06:56:34.252910 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:34.252939 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:34.252981 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:34.317596 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:34.317619 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:34.317632 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:34.353971 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:34.354003 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:34.395840 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:34.395875 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:34.432603 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:34.432645 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:34.469758 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:34.469788 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:34.533120 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:34.533156 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:34.550280 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:34.550307 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:34.582816 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:34.582847 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:37.112333 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:37.123737 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:37.123841 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:37.148224 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:37.148247 2212400 cri.go:92] found id: ""
	I1219 06:56:37.148256 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:37.148312 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:37.151895 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:37.151986 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:37.183280 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:37.183303 2212400 cri.go:92] found id: ""
	I1219 06:56:37.183311 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:37.183369 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:37.187066 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:37.187143 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:37.214404 2212400 cri.go:92] found id: ""
	I1219 06:56:37.214432 2212400 logs.go:282] 0 containers: []
	W1219 06:56:37.214442 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:37.214449 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:37.214524 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:37.243548 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:37.243572 2212400 cri.go:92] found id: ""
	I1219 06:56:37.243580 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:37.243652 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:37.247130 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:37.247250 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:37.273607 2212400 cri.go:92] found id: ""
	I1219 06:56:37.273684 2212400 logs.go:282] 0 containers: []
	W1219 06:56:37.273706 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:37.273727 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:37.273817 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:37.303918 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:37.303988 2212400 cri.go:92] found id: ""
	I1219 06:56:37.304025 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:37.304118 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:37.307899 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:37.307995 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:37.334730 2212400 cri.go:92] found id: ""
	I1219 06:56:37.334758 2212400 logs.go:282] 0 containers: []
	W1219 06:56:37.334767 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:37.334773 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:37.334885 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:37.368646 2212400 cri.go:92] found id: ""
	I1219 06:56:37.368677 2212400 logs.go:282] 0 containers: []
	W1219 06:56:37.368687 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:37.368700 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:37.368712 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:37.384977 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:37.385009 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:37.471402 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:37.471465 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:37.471494 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:37.506739 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:37.506771 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:37.539982 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:37.540012 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:37.574839 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:37.574869 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:37.603817 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:37.603845 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:37.667304 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:37.667345 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:37.703873 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:37.703915 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:40.234900 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:40.245429 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:40.245503 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:40.270577 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:40.270602 2212400 cri.go:92] found id: ""
	I1219 06:56:40.270610 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:40.270670 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:40.274528 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:40.274650 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:40.302474 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:40.302497 2212400 cri.go:92] found id: ""
	I1219 06:56:40.302505 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:40.302565 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:40.306366 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:40.306439 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:40.331755 2212400 cri.go:92] found id: ""
	I1219 06:56:40.331780 2212400 logs.go:282] 0 containers: []
	W1219 06:56:40.331791 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:40.331798 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:40.331857 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:40.357094 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:40.357117 2212400 cri.go:92] found id: ""
	I1219 06:56:40.357126 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:40.357184 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:40.360944 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:40.361023 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:40.386461 2212400 cri.go:92] found id: ""
	I1219 06:56:40.386485 2212400 logs.go:282] 0 containers: []
	W1219 06:56:40.386494 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:40.386500 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:40.386585 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:40.428990 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:40.429013 2212400 cri.go:92] found id: ""
	I1219 06:56:40.429021 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:40.429077 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:40.433133 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:40.433208 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:40.465655 2212400 cri.go:92] found id: ""
	I1219 06:56:40.465680 2212400 logs.go:282] 0 containers: []
	W1219 06:56:40.465689 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:40.465695 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:40.465792 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:40.499601 2212400 cri.go:92] found id: ""
	I1219 06:56:40.499626 2212400 logs.go:282] 0 containers: []
	W1219 06:56:40.499635 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:40.499682 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:40.499701 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:40.518056 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:40.518086 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:40.586012 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:40.586035 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:40.586048 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:40.624713 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:40.624748 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:40.657552 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:40.657586 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:40.700391 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:40.700426 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:40.738551 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:40.738585 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:40.769417 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:40.769451 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:40.799767 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:40.799796 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:43.359198 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:43.371807 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:43.371877 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:43.400639 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:43.400661 2212400 cri.go:92] found id: ""
	I1219 06:56:43.400669 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:43.400727 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:43.405193 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:43.405265 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:43.433598 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:43.433623 2212400 cri.go:92] found id: ""
	I1219 06:56:43.433631 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:43.433691 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:43.437920 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:43.437992 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:43.466332 2212400 cri.go:92] found id: ""
	I1219 06:56:43.466357 2212400 logs.go:282] 0 containers: []
	W1219 06:56:43.466365 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:43.466372 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:43.466440 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:43.494237 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:43.494259 2212400 cri.go:92] found id: ""
	I1219 06:56:43.494267 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:43.494348 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:43.498134 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:43.498242 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:43.526323 2212400 cri.go:92] found id: ""
	I1219 06:56:43.526347 2212400 logs.go:282] 0 containers: []
	W1219 06:56:43.526356 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:43.526363 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:43.526449 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:43.553499 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:43.553522 2212400 cri.go:92] found id: ""
	I1219 06:56:43.553530 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:43.553589 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:43.557842 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:43.557915 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:43.582439 2212400 cri.go:92] found id: ""
	I1219 06:56:43.582465 2212400 logs.go:282] 0 containers: []
	W1219 06:56:43.582474 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:43.582481 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:43.582543 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:43.613161 2212400 cri.go:92] found id: ""
	I1219 06:56:43.613186 2212400 logs.go:282] 0 containers: []
	W1219 06:56:43.613196 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:43.613211 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:43.613223 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:43.629731 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:43.629762 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:43.663500 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:43.663580 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:43.722741 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:43.722776 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:43.789542 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:43.789585 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:43.789600 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:43.827349 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:43.827381 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:43.863989 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:43.864023 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:43.903890 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:43.903921 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:43.934213 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:43.934245 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:46.464910 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:46.475521 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:46.475589 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:46.504146 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:46.504209 2212400 cri.go:92] found id: ""
	I1219 06:56:46.504232 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:46.504302 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:46.508017 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:46.508128 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:46.534903 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:46.534966 2212400 cri.go:92] found id: ""
	I1219 06:56:46.534988 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:46.535061 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:46.538611 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:46.538683 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:46.564643 2212400 cri.go:92] found id: ""
	I1219 06:56:46.564718 2212400 logs.go:282] 0 containers: []
	W1219 06:56:46.564741 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:46.564805 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:46.564900 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:46.591266 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:46.591285 2212400 cri.go:92] found id: ""
	I1219 06:56:46.591293 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:46.591357 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:46.595018 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:46.595101 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:46.623176 2212400 cri.go:92] found id: ""
	I1219 06:56:46.623201 2212400 logs.go:282] 0 containers: []
	W1219 06:56:46.623209 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:46.623215 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:46.623277 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:46.648750 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:46.648808 2212400 cri.go:92] found id: ""
	I1219 06:56:46.648817 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:46.648874 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:46.652621 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:46.652695 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:46.679147 2212400 cri.go:92] found id: ""
	I1219 06:56:46.679175 2212400 logs.go:282] 0 containers: []
	W1219 06:56:46.679184 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:46.679190 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:46.679255 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:46.703809 2212400 cri.go:92] found id: ""
	I1219 06:56:46.703835 2212400 logs.go:282] 0 containers: []
	W1219 06:56:46.703844 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:46.703857 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:46.703870 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:46.740145 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:46.740228 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:46.800064 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:46.800099 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:46.831229 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:46.831268 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:46.848155 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:46.848186 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:46.914058 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:46.914079 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:46.914092 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:46.950935 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:46.950967 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:46.991671 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:46.991705 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:47.029730 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:47.029761 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:49.569305 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:49.579761 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:49.579836 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:49.609289 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:49.609312 2212400 cri.go:92] found id: ""
	I1219 06:56:49.609323 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:49.609383 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:49.613241 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:49.613325 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:49.639019 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:49.639040 2212400 cri.go:92] found id: ""
	I1219 06:56:49.639048 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:49.639107 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:49.642727 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:49.642802 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:49.671807 2212400 cri.go:92] found id: ""
	I1219 06:56:49.671830 2212400 logs.go:282] 0 containers: []
	W1219 06:56:49.671838 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:49.671845 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:49.671907 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:49.701038 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:49.701061 2212400 cri.go:92] found id: ""
	I1219 06:56:49.701069 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:49.701127 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:49.704841 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:49.704913 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:49.730394 2212400 cri.go:92] found id: ""
	I1219 06:56:49.730419 2212400 logs.go:282] 0 containers: []
	W1219 06:56:49.730428 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:49.730435 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:49.730495 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:49.755701 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:49.755727 2212400 cri.go:92] found id: ""
	I1219 06:56:49.755736 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:49.755800 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:49.759492 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:49.759565 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:49.784966 2212400 cri.go:92] found id: ""
	I1219 06:56:49.784992 2212400 logs.go:282] 0 containers: []
	W1219 06:56:49.785001 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:49.785007 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:49.785068 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:49.814949 2212400 cri.go:92] found id: ""
	I1219 06:56:49.814973 2212400 logs.go:282] 0 containers: []
	W1219 06:56:49.814983 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:49.814996 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:49.815007 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:49.843498 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:49.843527 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:49.859635 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:49.859666 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:49.890128 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:49.890162 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:49.951842 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:49.951877 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:50.030588 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:50.030618 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:50.030640 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:50.066110 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:50.066147 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:50.101607 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:50.101645 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:50.137869 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:50.137905 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:52.710388 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:52.720608 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:52.720677 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:52.747079 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:52.747102 2212400 cri.go:92] found id: ""
	I1219 06:56:52.747111 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:52.747167 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:52.750998 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:52.751071 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:52.780903 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:52.780926 2212400 cri.go:92] found id: ""
	I1219 06:56:52.780935 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:52.780991 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:52.784692 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:52.784783 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:52.809355 2212400 cri.go:92] found id: ""
	I1219 06:56:52.809378 2212400 logs.go:282] 0 containers: []
	W1219 06:56:52.809386 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:52.809392 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:52.809450 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:52.838043 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:52.838068 2212400 cri.go:92] found id: ""
	I1219 06:56:52.838077 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:52.838135 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:52.841978 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:52.842047 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:52.866687 2212400 cri.go:92] found id: ""
	I1219 06:56:52.866717 2212400 logs.go:282] 0 containers: []
	W1219 06:56:52.866726 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:52.866732 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:52.866794 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:52.892275 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:52.892298 2212400 cri.go:92] found id: ""
	I1219 06:56:52.892307 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:52.892371 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:52.896376 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:52.896451 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:52.921548 2212400 cri.go:92] found id: ""
	I1219 06:56:52.921571 2212400 logs.go:282] 0 containers: []
	W1219 06:56:52.921580 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:52.921586 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:52.921645 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:52.949657 2212400 cri.go:92] found id: ""
	I1219 06:56:52.949680 2212400 logs.go:282] 0 containers: []
	W1219 06:56:52.949688 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:52.949704 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:52.949714 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:53.008079 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:53.008116 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:53.024968 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:53.024999 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:53.063590 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:53.063624 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:53.097586 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:53.097622 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:53.129385 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:53.129421 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:53.177911 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:53.177938 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:53.256720 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:53.256739 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:53.256752 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:53.295144 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:53.295192 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:55.829539 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:55.840074 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:55.840146 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:55.867134 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:55.867158 2212400 cri.go:92] found id: ""
	I1219 06:56:55.867166 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:55.867225 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:55.871047 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:55.871123 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:55.895956 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:55.895978 2212400 cri.go:92] found id: ""
	I1219 06:56:55.895986 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:55.896047 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:55.899815 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:55.899896 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:55.924988 2212400 cri.go:92] found id: ""
	I1219 06:56:55.925013 2212400 logs.go:282] 0 containers: []
	W1219 06:56:55.925022 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:55.925028 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:55.925094 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:55.951551 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:55.951574 2212400 cri.go:92] found id: ""
	I1219 06:56:55.951583 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:55.951642 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:55.955449 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:55.955526 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:55.980170 2212400 cri.go:92] found id: ""
	I1219 06:56:55.980196 2212400 logs.go:282] 0 containers: []
	W1219 06:56:55.980205 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:55.980212 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:55.980272 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:56.008231 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:56.008258 2212400 cri.go:92] found id: ""
	I1219 06:56:56.008268 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:56.008356 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:56.012591 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:56.012679 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:56.039493 2212400 cri.go:92] found id: ""
	I1219 06:56:56.039519 2212400 logs.go:282] 0 containers: []
	W1219 06:56:56.039529 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:56.039535 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:56.039596 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:56.066207 2212400 cri.go:92] found id: ""
	I1219 06:56:56.066233 2212400 logs.go:282] 0 containers: []
	W1219 06:56:56.066243 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:56.066256 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:56.066310 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:56.132234 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:56.132254 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:56.132267 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:56.179182 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:56.179214 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:56.219213 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:56.219293 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:56.251348 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:56.251376 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:56.309849 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:56.309886 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:56.351115 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:56.351148 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:56.398778 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:56.398808 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:56:56.430441 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:56.430479 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:58.948281 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:56:58.958390 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:56:58.958457 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:56:58.987539 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:58.987560 2212400 cri.go:92] found id: ""
	I1219 06:56:58.987568 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:56:58.987625 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:58.991256 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:56:58.991325 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:56:59.018524 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:59.018545 2212400 cri.go:92] found id: ""
	I1219 06:56:59.018553 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:56:59.018610 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:59.022459 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:56:59.022531 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:56:59.047486 2212400 cri.go:92] found id: ""
	I1219 06:56:59.047511 2212400 logs.go:282] 0 containers: []
	W1219 06:56:59.047520 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:56:59.047526 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:56:59.047587 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:56:59.077752 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:59.077776 2212400 cri.go:92] found id: ""
	I1219 06:56:59.077784 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:56:59.077860 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:59.081639 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:56:59.081717 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:56:59.106260 2212400 cri.go:92] found id: ""
	I1219 06:56:59.106296 2212400 logs.go:282] 0 containers: []
	W1219 06:56:59.106307 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:56:59.106330 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:56:59.106411 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:56:59.135677 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:59.135701 2212400 cri.go:92] found id: ""
	I1219 06:56:59.135710 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:56:59.135775 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:56:59.139556 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:56:59.139642 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:56:59.176912 2212400 cri.go:92] found id: ""
	I1219 06:56:59.176938 2212400 logs.go:282] 0 containers: []
	W1219 06:56:59.176947 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:56:59.176980 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:56:59.177073 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:56:59.202913 2212400 cri.go:92] found id: ""
	I1219 06:56:59.202940 2212400 logs.go:282] 0 containers: []
	W1219 06:56:59.202949 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:56:59.202981 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:56:59.202996 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:56:59.265662 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:56:59.265699 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:56:59.282722 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:56:59.282753 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:56:59.348712 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:56:59.348741 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:56:59.348789 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:56:59.383391 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:56:59.383426 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:56:59.425892 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:56:59.425929 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:56:59.472883 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:56:59.472918 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:56:59.517338 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:56:59.517366 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:56:59.563199 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:56:59.563239 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:02.093818 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:02.112824 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:02.112900 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:02.145409 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:02.145436 2212400 cri.go:92] found id: ""
	I1219 06:57:02.145444 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:02.145503 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:02.150732 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:02.150810 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:02.193691 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:02.193718 2212400 cri.go:92] found id: ""
	I1219 06:57:02.193727 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:02.193786 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:02.198333 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:02.198414 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:02.226815 2212400 cri.go:92] found id: ""
	I1219 06:57:02.226837 2212400 logs.go:282] 0 containers: []
	W1219 06:57:02.226846 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:02.226852 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:02.226923 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:02.253525 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:02.253547 2212400 cri.go:92] found id: ""
	I1219 06:57:02.253556 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:02.253618 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:02.257684 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:02.257767 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:02.285479 2212400 cri.go:92] found id: ""
	I1219 06:57:02.285507 2212400 logs.go:282] 0 containers: []
	W1219 06:57:02.285517 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:02.285524 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:02.285587 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:02.315312 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:02.315338 2212400 cri.go:92] found id: ""
	I1219 06:57:02.315347 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:02.315410 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:02.319424 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:02.319500 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:02.345503 2212400 cri.go:92] found id: ""
	I1219 06:57:02.345530 2212400 logs.go:282] 0 containers: []
	W1219 06:57:02.345539 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:02.345546 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:02.345613 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:02.372328 2212400 cri.go:92] found id: ""
	I1219 06:57:02.372406 2212400 logs.go:282] 0 containers: []
	W1219 06:57:02.372428 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:02.372471 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:02.372499 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:02.433766 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:02.433804 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:02.466167 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:02.466204 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:02.508011 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:02.508048 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:02.542832 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:02.542879 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:02.560390 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:02.560449 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:02.627249 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:02.627272 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:02.627285 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:02.662282 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:02.662319 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:02.699027 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:02.699060 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:05.233038 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:05.243325 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:05.243399 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:05.272111 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:05.272136 2212400 cri.go:92] found id: ""
	I1219 06:57:05.272144 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:05.272214 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:05.275882 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:05.275955 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:05.302124 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:05.302210 2212400 cri.go:92] found id: ""
	I1219 06:57:05.302235 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:05.302319 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:05.306078 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:05.306222 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:05.330744 2212400 cri.go:92] found id: ""
	I1219 06:57:05.330813 2212400 logs.go:282] 0 containers: []
	W1219 06:57:05.330840 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:05.330860 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:05.330937 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:05.356574 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:05.356597 2212400 cri.go:92] found id: ""
	I1219 06:57:05.356605 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:05.356687 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:05.360478 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:05.360570 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:05.388121 2212400 cri.go:92] found id: ""
	I1219 06:57:05.388150 2212400 logs.go:282] 0 containers: []
	W1219 06:57:05.388160 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:05.388166 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:05.388269 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:05.414691 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:05.414766 2212400 cri.go:92] found id: ""
	I1219 06:57:05.414789 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:05.414865 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:05.418694 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:05.418775 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:05.444927 2212400 cri.go:92] found id: ""
	I1219 06:57:05.444956 2212400 logs.go:282] 0 containers: []
	W1219 06:57:05.444964 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:05.444971 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:05.445080 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:05.473566 2212400 cri.go:92] found id: ""
	I1219 06:57:05.473594 2212400 logs.go:282] 0 containers: []
	W1219 06:57:05.473604 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:05.473618 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:05.473630 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:05.510964 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:05.510996 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:05.540038 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:05.540077 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:05.572188 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:05.572221 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:05.632047 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:05.632083 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:05.648444 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:05.648475 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:05.684856 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:05.684889 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:05.750502 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:05.750522 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:05.750536 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:05.787090 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:05.787123 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:08.324142 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:08.336229 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:08.336305 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:08.362635 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:08.362663 2212400 cri.go:92] found id: ""
	I1219 06:57:08.362673 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:08.362736 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:08.366682 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:08.366757 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:08.392323 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:08.392347 2212400 cri.go:92] found id: ""
	I1219 06:57:08.392355 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:08.392412 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:08.396270 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:08.396342 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:08.422544 2212400 cri.go:92] found id: ""
	I1219 06:57:08.422569 2212400 logs.go:282] 0 containers: []
	W1219 06:57:08.422578 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:08.422585 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:08.422647 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:08.448578 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:08.448601 2212400 cri.go:92] found id: ""
	I1219 06:57:08.448610 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:08.448674 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:08.452364 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:08.452483 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:08.481698 2212400 cri.go:92] found id: ""
	I1219 06:57:08.481726 2212400 logs.go:282] 0 containers: []
	W1219 06:57:08.481735 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:08.481741 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:08.481802 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:08.507787 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:08.507813 2212400 cri.go:92] found id: ""
	I1219 06:57:08.507822 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:08.507888 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:08.511832 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:08.511905 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:08.539410 2212400 cri.go:92] found id: ""
	I1219 06:57:08.539437 2212400 logs.go:282] 0 containers: []
	W1219 06:57:08.539447 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:08.539454 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:08.539519 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:08.566065 2212400 cri.go:92] found id: ""
	I1219 06:57:08.566092 2212400 logs.go:282] 0 containers: []
	W1219 06:57:08.566101 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:08.566135 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:08.566152 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:08.629434 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:08.629473 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:08.698923 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:08.698945 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:08.698958 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:08.743934 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:08.743970 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:08.776550 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:08.776578 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:08.793004 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:08.793034 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:08.831735 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:08.831767 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:08.872398 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:08.872427 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:08.917768 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:08.917928 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:11.452699 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:11.463999 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:11.464074 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:11.494753 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:11.494777 2212400 cri.go:92] found id: ""
	I1219 06:57:11.494786 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:11.494852 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:11.498583 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:11.498653 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:11.524509 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:11.524531 2212400 cri.go:92] found id: ""
	I1219 06:57:11.524539 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:11.524604 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:11.528595 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:11.528673 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:11.553634 2212400 cri.go:92] found id: ""
	I1219 06:57:11.553660 2212400 logs.go:282] 0 containers: []
	W1219 06:57:11.553669 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:11.553676 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:11.553739 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:11.579243 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:11.579319 2212400 cri.go:92] found id: ""
	I1219 06:57:11.579351 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:11.579469 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:11.583646 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:11.583750 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:11.613446 2212400 cri.go:92] found id: ""
	I1219 06:57:11.613472 2212400 logs.go:282] 0 containers: []
	W1219 06:57:11.613481 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:11.613488 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:11.613575 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:11.639182 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:11.639281 2212400 cri.go:92] found id: ""
	I1219 06:57:11.639298 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:11.639363 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:11.643341 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:11.643416 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:11.669457 2212400 cri.go:92] found id: ""
	I1219 06:57:11.669483 2212400 logs.go:282] 0 containers: []
	W1219 06:57:11.669492 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:11.669499 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:11.669568 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:11.695298 2212400 cri.go:92] found id: ""
	I1219 06:57:11.695325 2212400 logs.go:282] 0 containers: []
	W1219 06:57:11.695334 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:11.695348 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:11.695359 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:11.737736 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:11.737768 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:11.773009 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:11.773086 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:11.803319 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:11.803351 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:11.865738 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:11.865773 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:11.901024 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:11.901069 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:11.939064 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:11.939099 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:11.979392 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:11.979418 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:11.996374 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:11.996402 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:12.067126 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:14.568332 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:14.578950 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:14.579027 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:14.604527 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:14.604547 2212400 cri.go:92] found id: ""
	I1219 06:57:14.604555 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:14.604611 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:14.608199 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:14.608270 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:14.633798 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:14.633823 2212400 cri.go:92] found id: ""
	I1219 06:57:14.633831 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:14.633887 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:14.637621 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:14.637699 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:14.663128 2212400 cri.go:92] found id: ""
	I1219 06:57:14.663150 2212400 logs.go:282] 0 containers: []
	W1219 06:57:14.663159 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:14.663166 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:14.663230 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:14.691795 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:14.691817 2212400 cri.go:92] found id: ""
	I1219 06:57:14.691826 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:14.691886 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:14.695564 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:14.695636 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:14.723483 2212400 cri.go:92] found id: ""
	I1219 06:57:14.723508 2212400 logs.go:282] 0 containers: []
	W1219 06:57:14.723517 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:14.723524 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:14.723583 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:14.749070 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:14.749093 2212400 cri.go:92] found id: ""
	I1219 06:57:14.749102 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:14.749161 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:14.752831 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:14.752905 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:14.777759 2212400 cri.go:92] found id: ""
	I1219 06:57:14.777782 2212400 logs.go:282] 0 containers: []
	W1219 06:57:14.777798 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:14.777805 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:14.777866 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:14.802988 2212400 cri.go:92] found id: ""
	I1219 06:57:14.803021 2212400 logs.go:282] 0 containers: []
	W1219 06:57:14.803031 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:14.803047 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:14.803062 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:14.862594 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:14.862633 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:14.878835 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:14.878866 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:14.970659 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:14.970678 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:14.970691 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:15.004972 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:15.005012 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:15.053738 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:15.053778 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:15.092714 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:15.092778 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:15.123604 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:15.123687 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:15.163695 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:15.163780 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:17.697021 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:17.707920 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:17.708001 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:17.735276 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:17.735298 2212400 cri.go:92] found id: ""
	I1219 06:57:17.735306 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:17.735367 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:17.739004 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:17.739075 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:17.762754 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:17.762776 2212400 cri.go:92] found id: ""
	I1219 06:57:17.762784 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:17.762841 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:17.766527 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:17.766606 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:17.794283 2212400 cri.go:92] found id: ""
	I1219 06:57:17.794308 2212400 logs.go:282] 0 containers: []
	W1219 06:57:17.794316 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:17.794323 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:17.794382 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:17.819609 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:17.819681 2212400 cri.go:92] found id: ""
	I1219 06:57:17.819704 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:17.819790 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:17.823547 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:17.823660 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:17.852803 2212400 cri.go:92] found id: ""
	I1219 06:57:17.852882 2212400 logs.go:282] 0 containers: []
	W1219 06:57:17.852903 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:17.852923 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:17.853016 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:17.878350 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:17.878373 2212400 cri.go:92] found id: ""
	I1219 06:57:17.878382 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:17.878459 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:17.882406 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:17.882485 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:17.924821 2212400 cri.go:92] found id: ""
	I1219 06:57:17.924847 2212400 logs.go:282] 0 containers: []
	W1219 06:57:17.924855 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:17.924862 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:17.924923 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:17.959214 2212400 cri.go:92] found id: ""
	I1219 06:57:17.959238 2212400 logs.go:282] 0 containers: []
	W1219 06:57:17.959247 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:17.959264 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:17.959276 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:17.997123 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:17.997154 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:18.037306 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:18.037339 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:18.067841 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:18.067920 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:18.088500 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:18.088580 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:18.129264 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:18.129297 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:18.164328 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:18.164358 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:18.207585 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:18.207613 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:18.267181 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:18.267216 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:18.331650 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:20.832785 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:20.843115 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:20.843191 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:20.868809 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:20.868833 2212400 cri.go:92] found id: ""
	I1219 06:57:20.868842 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:20.868899 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:20.872519 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:20.872589 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:20.899780 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:20.899854 2212400 cri.go:92] found id: ""
	I1219 06:57:20.899875 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:20.899958 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:20.904172 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:20.904286 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:20.930945 2212400 cri.go:92] found id: ""
	I1219 06:57:20.931012 2212400 logs.go:282] 0 containers: []
	W1219 06:57:20.931034 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:20.931055 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:20.931141 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:20.963653 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:20.963724 2212400 cri.go:92] found id: ""
	I1219 06:57:20.963758 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:20.963845 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:20.967626 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:20.967747 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:20.992745 2212400 cri.go:92] found id: ""
	I1219 06:57:20.992881 2212400 logs.go:282] 0 containers: []
	W1219 06:57:20.992906 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:20.992940 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:20.993021 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:21.020346 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:21.020421 2212400 cri.go:92] found id: ""
	I1219 06:57:21.020443 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:21.020529 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:21.024354 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:21.024480 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:21.049250 2212400 cri.go:92] found id: ""
	I1219 06:57:21.049276 2212400 logs.go:282] 0 containers: []
	W1219 06:57:21.049285 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:21.049291 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:21.049350 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:21.075432 2212400 cri.go:92] found id: ""
	I1219 06:57:21.075513 2212400 logs.go:282] 0 containers: []
	W1219 06:57:21.075550 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:21.075586 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:21.075618 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:21.109727 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:21.109758 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:21.139407 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:21.139445 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:21.169320 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:21.169350 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:21.185714 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:21.185741 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:21.248375 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:21.248401 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:21.248414 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:21.286199 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:21.286231 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:21.318925 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:21.318955 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:21.379806 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:21.379887 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:23.916626 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:23.928566 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:23.928642 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:23.959363 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:23.959386 2212400 cri.go:92] found id: ""
	I1219 06:57:23.959394 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:23.959458 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:23.964199 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:23.964276 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:23.993688 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:23.993712 2212400 cri.go:92] found id: ""
	I1219 06:57:23.993720 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:23.993780 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:23.997586 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:23.997660 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:24.024635 2212400 cri.go:92] found id: ""
	I1219 06:57:24.024662 2212400 logs.go:282] 0 containers: []
	W1219 06:57:24.024671 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:24.024678 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:24.024742 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:24.051277 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:24.051301 2212400 cri.go:92] found id: ""
	I1219 06:57:24.051310 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:24.051367 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:24.055091 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:24.055166 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:24.082330 2212400 cri.go:92] found id: ""
	I1219 06:57:24.082361 2212400 logs.go:282] 0 containers: []
	W1219 06:57:24.082370 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:24.082376 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:24.082483 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:24.112283 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:24.112304 2212400 cri.go:92] found id: ""
	I1219 06:57:24.112312 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:24.112371 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:24.116415 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:24.116488 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:24.143661 2212400 cri.go:92] found id: ""
	I1219 06:57:24.143685 2212400 logs.go:282] 0 containers: []
	W1219 06:57:24.143695 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:24.143701 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:24.143764 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:24.169345 2212400 cri.go:92] found id: ""
	I1219 06:57:24.169422 2212400 logs.go:282] 0 containers: []
	W1219 06:57:24.169439 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:24.169454 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:24.169465 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:24.228588 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:24.228624 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:24.244734 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:24.244783 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:24.305451 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:24.305475 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:24.305489 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:24.339721 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:24.339751 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:24.377194 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:24.377233 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:24.411381 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:24.411412 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:24.442460 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:24.442496 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:24.471249 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:24.471277 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:27.012910 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:27.024168 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:27.024243 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:27.048557 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:27.048583 2212400 cri.go:92] found id: ""
	I1219 06:57:27.048592 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:27.048650 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:27.052511 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:27.052584 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:27.078075 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:27.078099 2212400 cri.go:92] found id: ""
	I1219 06:57:27.078107 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:27.078164 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:27.081986 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:27.082058 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:27.110350 2212400 cri.go:92] found id: ""
	I1219 06:57:27.110375 2212400 logs.go:282] 0 containers: []
	W1219 06:57:27.110384 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:27.110391 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:27.110468 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:27.139836 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:27.139859 2212400 cri.go:92] found id: ""
	I1219 06:57:27.139867 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:27.139924 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:27.143564 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:27.143635 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:27.168793 2212400 cri.go:92] found id: ""
	I1219 06:57:27.168819 2212400 logs.go:282] 0 containers: []
	W1219 06:57:27.168828 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:27.168835 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:27.168897 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:27.193683 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:27.193708 2212400 cri.go:92] found id: ""
	I1219 06:57:27.193717 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:27.193795 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:27.197647 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:27.197734 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:27.221534 2212400 cri.go:92] found id: ""
	I1219 06:57:27.221557 2212400 logs.go:282] 0 containers: []
	W1219 06:57:27.221566 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:27.221572 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:27.221642 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:27.247360 2212400 cri.go:92] found id: ""
	I1219 06:57:27.247384 2212400 logs.go:282] 0 containers: []
	W1219 06:57:27.247392 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:27.247405 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:27.247416 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:27.306296 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:27.306327 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:27.375213 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:27.375245 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:27.375258 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:27.413634 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:27.413666 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:27.449341 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:27.449371 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:27.479051 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:27.479088 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:27.519479 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:27.519507 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:27.535823 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:27.535853 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:27.568246 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:27.568285 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:30.102710 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:30.114134 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:30.114246 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:30.141375 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:30.141402 2212400 cri.go:92] found id: ""
	I1219 06:57:30.141411 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:30.141473 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:30.145941 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:30.146014 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:30.172960 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:30.173035 2212400 cri.go:92] found id: ""
	I1219 06:57:30.173057 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:30.173143 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:30.177127 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:30.177203 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:30.202899 2212400 cri.go:92] found id: ""
	I1219 06:57:30.202925 2212400 logs.go:282] 0 containers: []
	W1219 06:57:30.202934 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:30.202941 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:30.203005 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:30.232547 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:30.232571 2212400 cri.go:92] found id: ""
	I1219 06:57:30.232580 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:30.232637 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:30.236421 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:30.236494 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:30.262029 2212400 cri.go:92] found id: ""
	I1219 06:57:30.262057 2212400 logs.go:282] 0 containers: []
	W1219 06:57:30.262067 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:30.262073 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:30.262177 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:30.287937 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:30.287960 2212400 cri.go:92] found id: ""
	I1219 06:57:30.287969 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:30.288028 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:30.291935 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:30.292011 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:30.320889 2212400 cri.go:92] found id: ""
	I1219 06:57:30.320912 2212400 logs.go:282] 0 containers: []
	W1219 06:57:30.320921 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:30.320928 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:30.321000 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:30.345845 2212400 cri.go:92] found id: ""
	I1219 06:57:30.345924 2212400 logs.go:282] 0 containers: []
	W1219 06:57:30.345940 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:30.345954 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:30.345966 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:30.400979 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:30.401015 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:30.448329 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:30.448359 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:30.478949 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:30.478981 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:30.495220 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:30.495248 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:30.527653 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:30.527689 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:30.560107 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:30.560138 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:30.592501 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:30.592529 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:30.651689 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:30.651768 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:30.738247 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:33.239249 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:33.253787 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:33.253853 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:33.290019 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:33.290039 2212400 cri.go:92] found id: ""
	I1219 06:57:33.290047 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:33.290104 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:33.294873 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:33.294946 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:33.341326 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:33.341350 2212400 cri.go:92] found id: ""
	I1219 06:57:33.341370 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:33.341432 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:33.345867 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:33.345942 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:33.397523 2212400 cri.go:92] found id: ""
	I1219 06:57:33.397548 2212400 logs.go:282] 0 containers: []
	W1219 06:57:33.397557 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:33.397563 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:33.397625 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:33.430865 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:33.430890 2212400 cri.go:92] found id: ""
	I1219 06:57:33.430899 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:33.430956 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:33.435565 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:33.435639 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:33.465076 2212400 cri.go:92] found id: ""
	I1219 06:57:33.465101 2212400 logs.go:282] 0 containers: []
	W1219 06:57:33.465110 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:33.465116 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:33.465188 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:33.499484 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:33.499507 2212400 cri.go:92] found id: ""
	I1219 06:57:33.499516 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:33.499589 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:33.503731 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:33.503818 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:33.535234 2212400 cri.go:92] found id: ""
	I1219 06:57:33.535262 2212400 logs.go:282] 0 containers: []
	W1219 06:57:33.535271 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:33.535278 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:33.535339 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:33.581070 2212400 cri.go:92] found id: ""
	I1219 06:57:33.581144 2212400 logs.go:282] 0 containers: []
	W1219 06:57:33.581167 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:33.581209 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:33.581236 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:33.660123 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:33.660208 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:33.686242 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:33.686269 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:33.755386 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:33.755406 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:33.755418 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:33.792355 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:33.792389 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:33.828114 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:33.828151 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:33.868775 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:33.868853 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:33.911951 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:33.911980 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:33.941750 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:33.941786 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:36.471964 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:36.482596 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:36.482667 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:36.517425 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:36.517445 2212400 cri.go:92] found id: ""
	I1219 06:57:36.517453 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:36.517510 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:36.523260 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:36.523330 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:36.563265 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:36.563284 2212400 cri.go:92] found id: ""
	I1219 06:57:36.563292 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:36.563348 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:36.567704 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:36.567833 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:36.602279 2212400 cri.go:92] found id: ""
	I1219 06:57:36.602357 2212400 logs.go:282] 0 containers: []
	W1219 06:57:36.602381 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:36.602402 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:36.602521 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:36.635414 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:36.635511 2212400 cri.go:92] found id: ""
	I1219 06:57:36.635578 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:36.635674 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:36.640667 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:36.640817 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:36.696945 2212400 cri.go:92] found id: ""
	I1219 06:57:36.697037 2212400 logs.go:282] 0 containers: []
	W1219 06:57:36.697067 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:36.697091 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:36.697205 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:36.753721 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:36.753798 2212400 cri.go:92] found id: ""
	I1219 06:57:36.753821 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:36.753908 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:36.761393 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:36.761474 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:36.794986 2212400 cri.go:92] found id: ""
	I1219 06:57:36.795062 2212400 logs.go:282] 0 containers: []
	W1219 06:57:36.795085 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:36.795105 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:36.795195 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:36.828632 2212400 cri.go:92] found id: ""
	I1219 06:57:36.828658 2212400 logs.go:282] 0 containers: []
	W1219 06:57:36.828667 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:36.828680 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:36.828703 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:36.861434 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:36.861469 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:36.901787 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:36.901825 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:36.965529 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:36.965567 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:37.013690 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:37.013730 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:37.063538 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:37.063618 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:37.109012 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:37.109044 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:37.181916 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:37.181998 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:37.198340 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:37.198369 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:37.258945 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:39.759223 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:39.772016 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:39.772091 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:39.807620 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:39.807709 2212400 cri.go:92] found id: ""
	I1219 06:57:39.807732 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:39.807827 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:39.812643 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:39.812713 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:39.850225 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:39.850246 2212400 cri.go:92] found id: ""
	I1219 06:57:39.850255 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:39.850309 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:39.854082 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:39.854164 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:39.891776 2212400 cri.go:92] found id: ""
	I1219 06:57:39.891798 2212400 logs.go:282] 0 containers: []
	W1219 06:57:39.891806 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:39.891812 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:39.891865 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:39.926289 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:39.926314 2212400 cri.go:92] found id: ""
	I1219 06:57:39.926323 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:39.926391 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:39.930597 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:39.930675 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:39.957046 2212400 cri.go:92] found id: ""
	I1219 06:57:39.957072 2212400 logs.go:282] 0 containers: []
	W1219 06:57:39.957081 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:39.957088 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:39.957153 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:39.986692 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:39.986721 2212400 cri.go:92] found id: ""
	I1219 06:57:39.986730 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:39.986794 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:39.993270 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:39.993358 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:40.034129 2212400 cri.go:92] found id: ""
	I1219 06:57:40.034158 2212400 logs.go:282] 0 containers: []
	W1219 06:57:40.034168 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:40.034174 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:40.034252 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:40.068825 2212400 cri.go:92] found id: ""
	I1219 06:57:40.068852 2212400 logs.go:282] 0 containers: []
	W1219 06:57:40.068861 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:40.068878 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:40.068891 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:40.181534 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:40.181553 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:40.181567 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:40.235259 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:40.235362 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:40.283131 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:40.283803 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:40.324122 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:40.324196 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:40.404674 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:40.404716 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:40.487341 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:40.491619 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:40.547938 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:40.547975 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:40.581233 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:40.581271 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:43.101715 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:43.113420 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:43.113499 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:43.144899 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:43.144924 2212400 cri.go:92] found id: ""
	I1219 06:57:43.144933 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:43.144993 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:43.149439 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:43.149511 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:43.175128 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:43.175148 2212400 cri.go:92] found id: ""
	I1219 06:57:43.175156 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:43.175212 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:43.178853 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:43.178931 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:43.207263 2212400 cri.go:92] found id: ""
	I1219 06:57:43.207290 2212400 logs.go:282] 0 containers: []
	W1219 06:57:43.207300 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:43.207307 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:43.207367 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:43.234482 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:43.234503 2212400 cri.go:92] found id: ""
	I1219 06:57:43.234511 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:43.234569 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:43.238211 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:43.238332 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:43.264068 2212400 cri.go:92] found id: ""
	I1219 06:57:43.264091 2212400 logs.go:282] 0 containers: []
	W1219 06:57:43.264099 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:43.264106 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:43.264170 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:43.288438 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:43.288461 2212400 cri.go:92] found id: ""
	I1219 06:57:43.288469 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:43.288528 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:43.292558 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:43.292702 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:43.317185 2212400 cri.go:92] found id: ""
	I1219 06:57:43.317212 2212400 logs.go:282] 0 containers: []
	W1219 06:57:43.317221 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:43.317228 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:43.317339 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:43.350809 2212400 cri.go:92] found id: ""
	I1219 06:57:43.350836 2212400 logs.go:282] 0 containers: []
	W1219 06:57:43.350845 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:43.350883 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:43.350905 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:43.387063 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:43.387139 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:43.510860 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:43.510943 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:43.533653 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:43.533733 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:43.628312 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:43.628331 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:43.628345 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:43.666006 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:43.666042 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:43.714203 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:43.714281 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:43.759453 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:43.759532 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:43.794419 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:43.794506 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:46.362144 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:46.373306 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:46.373376 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:46.410065 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:46.410087 2212400 cri.go:92] found id: ""
	I1219 06:57:46.410096 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:46.410153 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:46.415375 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:46.415455 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:46.456030 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:46.456052 2212400 cri.go:92] found id: ""
	I1219 06:57:46.456061 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:46.456117 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:46.461962 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:46.462037 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:46.491182 2212400 cri.go:92] found id: ""
	I1219 06:57:46.491211 2212400 logs.go:282] 0 containers: []
	W1219 06:57:46.491220 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:46.491227 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:46.491286 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:46.516552 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:46.516577 2212400 cri.go:92] found id: ""
	I1219 06:57:46.516595 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:46.516671 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:46.520523 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:46.520597 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:46.547074 2212400 cri.go:92] found id: ""
	I1219 06:57:46.547101 2212400 logs.go:282] 0 containers: []
	W1219 06:57:46.547111 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:46.547118 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:46.547179 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:46.571946 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:46.571971 2212400 cri.go:92] found id: ""
	I1219 06:57:46.571980 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:46.572036 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:46.575725 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:46.575798 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:46.601397 2212400 cri.go:92] found id: ""
	I1219 06:57:46.601421 2212400 logs.go:282] 0 containers: []
	W1219 06:57:46.601431 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:46.601438 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:46.601504 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:46.626914 2212400 cri.go:92] found id: ""
	I1219 06:57:46.626942 2212400 logs.go:282] 0 containers: []
	W1219 06:57:46.626952 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:46.626969 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:46.626982 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:46.693203 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:46.693222 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:46.693236 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:46.728939 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:46.728972 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:46.762181 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:46.762215 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:46.800592 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:46.800625 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:46.829911 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:46.829941 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:46.889346 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:46.889386 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:46.906503 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:46.906532 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:46.955654 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:46.955753 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:49.490856 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:49.501096 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:49.501168 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:49.531332 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:49.531352 2212400 cri.go:92] found id: ""
	I1219 06:57:49.531359 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:49.531421 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:49.535262 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:49.535346 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:49.564067 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:49.564090 2212400 cri.go:92] found id: ""
	I1219 06:57:49.564098 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:49.564161 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:49.567890 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:49.567965 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:49.593273 2212400 cri.go:92] found id: ""
	I1219 06:57:49.593297 2212400 logs.go:282] 0 containers: []
	W1219 06:57:49.593306 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:49.593313 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:49.593373 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:49.619702 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:49.619725 2212400 cri.go:92] found id: ""
	I1219 06:57:49.619735 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:49.619793 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:49.625984 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:49.626062 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:49.655510 2212400 cri.go:92] found id: ""
	I1219 06:57:49.655535 2212400 logs.go:282] 0 containers: []
	W1219 06:57:49.655543 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:49.655550 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:49.655615 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:49.681038 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:49.681070 2212400 cri.go:92] found id: ""
	I1219 06:57:49.681080 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:49.681148 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:49.684927 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:49.685003 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:49.715362 2212400 cri.go:92] found id: ""
	I1219 06:57:49.715444 2212400 logs.go:282] 0 containers: []
	W1219 06:57:49.715466 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:49.715485 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:49.715585 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:49.744959 2212400 cri.go:92] found id: ""
	I1219 06:57:49.744983 2212400 logs.go:282] 0 containers: []
	W1219 06:57:49.744992 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:49.745006 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:49.745018 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:49.803688 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:49.803725 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:49.835535 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:49.835568 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:49.865696 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:49.865723 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:49.882477 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:49.882506 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:49.949324 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:49.949347 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:49.949361 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:49.998239 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:49.998272 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:50.048649 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:50.048682 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:50.084414 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:50.084451 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:52.618508 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:52.628611 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:52.628687 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:52.653668 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:52.653691 2212400 cri.go:92] found id: ""
	I1219 06:57:52.653699 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:52.653760 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:52.657545 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:52.657619 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:52.682778 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:52.682805 2212400 cri.go:92] found id: ""
	I1219 06:57:52.682815 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:52.682876 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:52.686669 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:52.686785 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:52.714041 2212400 cri.go:92] found id: ""
	I1219 06:57:52.714067 2212400 logs.go:282] 0 containers: []
	W1219 06:57:52.714076 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:52.714083 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:52.714145 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:52.740493 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:52.740516 2212400 cri.go:92] found id: ""
	I1219 06:57:52.740525 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:52.740585 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:52.744506 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:52.744612 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:52.769253 2212400 cri.go:92] found id: ""
	I1219 06:57:52.769277 2212400 logs.go:282] 0 containers: []
	W1219 06:57:52.769287 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:52.769294 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:52.769393 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:52.798546 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:52.798570 2212400 cri.go:92] found id: ""
	I1219 06:57:52.798580 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:52.798642 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:52.802455 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:52.802528 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:52.827591 2212400 cri.go:92] found id: ""
	I1219 06:57:52.827618 2212400 logs.go:282] 0 containers: []
	W1219 06:57:52.827628 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:52.827634 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:52.827697 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:52.852554 2212400 cri.go:92] found id: ""
	I1219 06:57:52.852581 2212400 logs.go:282] 0 containers: []
	W1219 06:57:52.852590 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:52.852606 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:52.852617 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:52.911075 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:52.911111 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:52.928288 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:52.928323 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:52.996858 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:52.996877 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:52.996892 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:53.034439 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:53.034469 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:53.071709 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:53.071739 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:53.115635 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:53.115663 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:53.153387 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:53.153428 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:53.207050 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:53.207079 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:55.743182 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:55.754049 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:55.754130 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:55.784567 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:55.784586 2212400 cri.go:92] found id: ""
	I1219 06:57:55.784594 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:55.784652 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:55.788435 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:55.788510 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:55.821884 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:55.821907 2212400 cri.go:92] found id: ""
	I1219 06:57:55.821915 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:55.821981 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:55.825928 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:55.826004 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:55.854824 2212400 cri.go:92] found id: ""
	I1219 06:57:55.854863 2212400 logs.go:282] 0 containers: []
	W1219 06:57:55.854873 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:55.854881 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:55.854945 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:55.880824 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:55.880847 2212400 cri.go:92] found id: ""
	I1219 06:57:55.880855 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:55.880914 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:55.884749 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:55.884854 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:55.911025 2212400 cri.go:92] found id: ""
	I1219 06:57:55.911048 2212400 logs.go:282] 0 containers: []
	W1219 06:57:55.911057 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:55.911063 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:55.911126 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:55.939592 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:55.939613 2212400 cri.go:92] found id: ""
	I1219 06:57:55.939622 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:55.939682 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:55.943572 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:55.943707 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:55.973471 2212400 cri.go:92] found id: ""
	I1219 06:57:55.973496 2212400 logs.go:282] 0 containers: []
	W1219 06:57:55.973505 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:55.973512 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:55.973595 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:55.998532 2212400 cri.go:92] found id: ""
	I1219 06:57:55.998558 2212400 logs.go:282] 0 containers: []
	W1219 06:57:55.998568 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:55.998583 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:55.998595 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:56.039338 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:56.039371 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:57:56.070035 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:56.070071 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:56.087510 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:56.087540 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:56.123492 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:56.123527 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:56.158843 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:56.158878 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:56.202998 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:56.203033 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:56.241366 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:56.241454 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:56.306436 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:56.306476 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:56.381376 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:58.881922 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:57:58.892103 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:57:58.892171 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:57:58.917215 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:58.917236 2212400 cri.go:92] found id: ""
	I1219 06:57:58.917244 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:57:58.917300 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:58.920901 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:57:58.920981 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:57:58.953976 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:58.953997 2212400 cri.go:92] found id: ""
	I1219 06:57:58.954006 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:57:58.954065 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:58.957782 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:57:58.957860 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:57:58.982854 2212400 cri.go:92] found id: ""
	I1219 06:57:58.982880 2212400 logs.go:282] 0 containers: []
	W1219 06:57:58.982888 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:57:58.982895 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:57:58.982961 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:57:59.017280 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:59.017303 2212400 cri.go:92] found id: ""
	I1219 06:57:59.017312 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:57:59.017370 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:59.021223 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:57:59.021297 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:57:59.045989 2212400 cri.go:92] found id: ""
	I1219 06:57:59.046014 2212400 logs.go:282] 0 containers: []
	W1219 06:57:59.046023 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:57:59.046030 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:57:59.046090 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:57:59.070314 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:59.070353 2212400 cri.go:92] found id: ""
	I1219 06:57:59.070362 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:57:59.070427 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:57:59.074058 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:57:59.074130 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:57:59.103209 2212400 cri.go:92] found id: ""
	I1219 06:57:59.103234 2212400 logs.go:282] 0 containers: []
	W1219 06:57:59.103243 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:57:59.103249 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:57:59.103310 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:57:59.127767 2212400 cri.go:92] found id: ""
	I1219 06:57:59.127792 2212400 logs.go:282] 0 containers: []
	W1219 06:57:59.127802 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:57:59.127816 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:57:59.127827 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:57:59.190782 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:57:59.190822 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:57:59.231911 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:57:59.231946 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:57:59.249030 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:57:59.249060 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:57:59.311442 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:57:59.311461 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:57:59.311474 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:57:59.345832 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:57:59.345864 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:57:59.385344 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:57:59.385379 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:57:59.423434 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:57:59.423468 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:57:59.454905 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:57:59.454941 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:01.983950 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:01.994255 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:01.994330 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:02.021732 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:02.021759 2212400 cri.go:92] found id: ""
	I1219 06:58:02.021768 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:02.021828 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:02.025745 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:02.025825 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:02.051692 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:02.051717 2212400 cri.go:92] found id: ""
	I1219 06:58:02.051726 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:02.051782 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:02.055635 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:02.055716 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:02.082640 2212400 cri.go:92] found id: ""
	I1219 06:58:02.082666 2212400 logs.go:282] 0 containers: []
	W1219 06:58:02.082675 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:02.082682 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:02.082748 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:02.110176 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:02.110204 2212400 cri.go:92] found id: ""
	I1219 06:58:02.110216 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:02.110280 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:02.114173 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:02.114255 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:02.140664 2212400 cri.go:92] found id: ""
	I1219 06:58:02.140686 2212400 logs.go:282] 0 containers: []
	W1219 06:58:02.140695 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:02.140701 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:02.140790 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:02.168323 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:02.168347 2212400 cri.go:92] found id: ""
	I1219 06:58:02.168364 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:02.168422 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:02.173028 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:02.173101 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:02.205593 2212400 cri.go:92] found id: ""
	I1219 06:58:02.205619 2212400 logs.go:282] 0 containers: []
	W1219 06:58:02.205628 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:02.205634 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:02.205696 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:02.237096 2212400 cri.go:92] found id: ""
	I1219 06:58:02.237123 2212400 logs.go:282] 0 containers: []
	W1219 06:58:02.237133 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:02.237149 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:02.237162 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:02.277442 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:02.277473 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:02.306730 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:02.306767 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:02.323688 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:02.323766 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:02.356344 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:02.356377 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:02.392073 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:02.392103 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:02.421507 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:02.421538 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:02.479828 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:02.479867 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:02.548032 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:02.548057 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:02.548072 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:05.083636 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:05.094070 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:05.094158 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:05.120276 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:05.120299 2212400 cri.go:92] found id: ""
	I1219 06:58:05.120307 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:05.120371 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:05.124469 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:05.124544 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:05.155861 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:05.155883 2212400 cri.go:92] found id: ""
	I1219 06:58:05.155892 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:05.155948 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:05.160488 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:05.160559 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:05.203215 2212400 cri.go:92] found id: ""
	I1219 06:58:05.203236 2212400 logs.go:282] 0 containers: []
	W1219 06:58:05.203245 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:05.203251 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:05.203309 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:05.237859 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:05.237883 2212400 cri.go:92] found id: ""
	I1219 06:58:05.237891 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:05.237949 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:05.241911 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:05.241991 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:05.266568 2212400 cri.go:92] found id: ""
	I1219 06:58:05.266594 2212400 logs.go:282] 0 containers: []
	W1219 06:58:05.266603 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:05.266610 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:05.266718 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:05.292233 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:05.292257 2212400 cri.go:92] found id: ""
	I1219 06:58:05.292265 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:05.292323 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:05.296061 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:05.296140 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:05.323458 2212400 cri.go:92] found id: ""
	I1219 06:58:05.323483 2212400 logs.go:282] 0 containers: []
	W1219 06:58:05.323493 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:05.323499 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:05.323562 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:05.351801 2212400 cri.go:92] found id: ""
	I1219 06:58:05.351827 2212400 logs.go:282] 0 containers: []
	W1219 06:58:05.351836 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:05.351849 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:05.351861 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:05.419911 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:05.419933 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:05.419945 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:05.452320 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:05.452351 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:05.491683 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:05.491712 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:05.520963 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:05.520996 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:05.549591 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:05.549621 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:05.567818 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:05.567847 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:05.603329 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:05.603359 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:05.641662 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:05.641694 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:08.201993 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:08.218098 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:08.218191 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:08.264974 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:08.264998 2212400 cri.go:92] found id: ""
	I1219 06:58:08.265006 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:08.265063 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:08.269027 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:08.269103 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:08.305280 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:08.305304 2212400 cri.go:92] found id: ""
	I1219 06:58:08.305325 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:08.305390 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:08.309479 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:08.309553 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:08.336331 2212400 cri.go:92] found id: ""
	I1219 06:58:08.336354 2212400 logs.go:282] 0 containers: []
	W1219 06:58:08.336363 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:08.336369 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:08.336431 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:08.375376 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:08.375398 2212400 cri.go:92] found id: ""
	I1219 06:58:08.375406 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:08.375464 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:08.383692 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:08.383765 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:08.424565 2212400 cri.go:92] found id: ""
	I1219 06:58:08.424588 2212400 logs.go:282] 0 containers: []
	W1219 06:58:08.424597 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:08.424604 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:08.424669 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:08.471793 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:08.471819 2212400 cri.go:92] found id: ""
	I1219 06:58:08.471828 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:08.471884 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:08.476160 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:08.476244 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:08.513066 2212400 cri.go:92] found id: ""
	I1219 06:58:08.513089 2212400 logs.go:282] 0 containers: []
	W1219 06:58:08.513098 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:08.513104 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:08.513164 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:08.543214 2212400 cri.go:92] found id: ""
	I1219 06:58:08.543241 2212400 logs.go:282] 0 containers: []
	W1219 06:58:08.543261 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:08.543276 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:08.543289 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:08.562526 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:08.562554 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:08.643313 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:08.643333 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:08.643346 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:08.680715 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:08.680746 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:08.715884 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:08.715912 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:08.787621 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:08.787662 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:08.848991 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:08.849027 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:08.904724 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:08.904883 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:08.987615 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:08.987646 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:11.524938 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:11.535159 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:11.535235 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:11.559819 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:11.559840 2212400 cri.go:92] found id: ""
	I1219 06:58:11.559847 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:11.559907 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:11.563649 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:11.563721 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:11.590653 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:11.590676 2212400 cri.go:92] found id: ""
	I1219 06:58:11.590684 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:11.590740 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:11.594516 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:11.594592 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:11.626427 2212400 cri.go:92] found id: ""
	I1219 06:58:11.626453 2212400 logs.go:282] 0 containers: []
	W1219 06:58:11.626463 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:11.626469 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:11.626533 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:11.654939 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:11.654964 2212400 cri.go:92] found id: ""
	I1219 06:58:11.654973 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:11.655031 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:11.658639 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:11.658717 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:11.684727 2212400 cri.go:92] found id: ""
	I1219 06:58:11.684751 2212400 logs.go:282] 0 containers: []
	W1219 06:58:11.684797 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:11.684804 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:11.684865 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:11.711140 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:11.711163 2212400 cri.go:92] found id: ""
	I1219 06:58:11.711176 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:11.711253 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:11.715008 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:11.715078 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:11.740204 2212400 cri.go:92] found id: ""
	I1219 06:58:11.740228 2212400 logs.go:282] 0 containers: []
	W1219 06:58:11.740236 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:11.740243 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:11.740306 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:11.769977 2212400 cri.go:92] found id: ""
	I1219 06:58:11.770053 2212400 logs.go:282] 0 containers: []
	W1219 06:58:11.770069 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:11.770083 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:11.770095 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:11.805227 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:11.805264 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:11.840180 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:11.840211 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:11.915618 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:11.915638 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:11.915652 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:11.958506 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:11.958593 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:12.014548 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:12.014593 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:12.046608 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:12.046642 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:12.074794 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:12.074822 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:12.135503 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:12.135544 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:14.653774 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:14.664465 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:14.664525 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:14.692216 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:14.692234 2212400 cri.go:92] found id: ""
	I1219 06:58:14.692241 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:14.692329 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:14.696397 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:14.696468 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:14.735705 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:14.735723 2212400 cri.go:92] found id: ""
	I1219 06:58:14.735731 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:14.735784 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:14.739795 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:14.739910 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:14.774126 2212400 cri.go:92] found id: ""
	I1219 06:58:14.774208 2212400 logs.go:282] 0 containers: []
	W1219 06:58:14.774231 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:14.774253 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:14.774344 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:14.807236 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:14.807307 2212400 cri.go:92] found id: ""
	I1219 06:58:14.807337 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:14.807416 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:14.811062 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:14.811193 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:14.847027 2212400 cri.go:92] found id: ""
	I1219 06:58:14.847098 2212400 logs.go:282] 0 containers: []
	W1219 06:58:14.847121 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:14.847142 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:14.847226 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:14.883430 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:14.883500 2212400 cri.go:92] found id: ""
	I1219 06:58:14.883524 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:14.883610 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:14.887569 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:14.887700 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:14.981919 2212400 cri.go:92] found id: ""
	I1219 06:58:14.981995 2212400 logs.go:282] 0 containers: []
	W1219 06:58:14.982019 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:14.982040 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:14.982128 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:15.019721 2212400 cri.go:92] found id: ""
	I1219 06:58:15.019820 2212400 logs.go:282] 0 containers: []
	W1219 06:58:15.019844 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:15.019879 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:15.019923 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:15.066623 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:15.066707 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:15.128402 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:15.128482 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:15.150398 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:15.150480 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:15.202658 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:15.202733 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:15.268704 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:15.268818 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:15.334379 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:15.334465 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:15.374803 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:15.374882 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:15.444051 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:15.444142 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:15.538507 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:18.038970 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:18.049835 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:18.049912 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:18.081324 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:18.081347 2212400 cri.go:92] found id: ""
	I1219 06:58:18.081355 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:18.081414 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:18.085247 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:18.085369 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:18.114750 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:18.114773 2212400 cri.go:92] found id: ""
	I1219 06:58:18.114781 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:18.114838 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:18.118639 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:18.118716 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:18.144483 2212400 cri.go:92] found id: ""
	I1219 06:58:18.144508 2212400 logs.go:282] 0 containers: []
	W1219 06:58:18.144517 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:18.144524 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:18.144583 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:18.170155 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:18.170198 2212400 cri.go:92] found id: ""
	I1219 06:58:18.170207 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:18.170267 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:18.174055 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:18.174128 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:18.198583 2212400 cri.go:92] found id: ""
	I1219 06:58:18.198606 2212400 logs.go:282] 0 containers: []
	W1219 06:58:18.198615 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:18.198622 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:18.198691 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:18.228506 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:18.228575 2212400 cri.go:92] found id: ""
	I1219 06:58:18.228588 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:18.228651 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:18.232511 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:18.232601 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:18.257135 2212400 cri.go:92] found id: ""
	I1219 06:58:18.257159 2212400 logs.go:282] 0 containers: []
	W1219 06:58:18.257168 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:18.257174 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:18.257265 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:18.287031 2212400 cri.go:92] found id: ""
	I1219 06:58:18.287057 2212400 logs.go:282] 0 containers: []
	W1219 06:58:18.287065 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:18.287079 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:18.287120 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:18.354003 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:18.354026 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:18.354040 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:18.394897 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:18.394992 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:18.424726 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:18.424777 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:18.486900 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:18.486934 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:18.503367 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:18.503399 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:18.536024 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:18.536055 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:18.577616 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:18.577658 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:18.615518 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:18.615552 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:21.145937 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:21.156027 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:21.156094 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:21.181095 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:21.181119 2212400 cri.go:92] found id: ""
	I1219 06:58:21.181128 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:21.181202 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:21.185093 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:21.185214 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:21.208630 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:21.208650 2212400 cri.go:92] found id: ""
	I1219 06:58:21.208658 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:21.208714 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:21.212406 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:21.212480 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:21.236809 2212400 cri.go:92] found id: ""
	I1219 06:58:21.236832 2212400 logs.go:282] 0 containers: []
	W1219 06:58:21.236840 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:21.236847 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:21.236909 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:21.262047 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:21.262070 2212400 cri.go:92] found id: ""
	I1219 06:58:21.262078 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:21.262134 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:21.265856 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:21.265922 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:21.292274 2212400 cri.go:92] found id: ""
	I1219 06:58:21.292297 2212400 logs.go:282] 0 containers: []
	W1219 06:58:21.292306 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:21.292312 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:21.292372 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:21.319665 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:21.319690 2212400 cri.go:92] found id: ""
	I1219 06:58:21.319698 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:21.319763 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:21.323578 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:21.323656 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:21.351175 2212400 cri.go:92] found id: ""
	I1219 06:58:21.351246 2212400 logs.go:282] 0 containers: []
	W1219 06:58:21.351269 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:21.351288 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:21.351403 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:21.384014 2212400 cri.go:92] found id: ""
	I1219 06:58:21.384096 2212400 logs.go:282] 0 containers: []
	W1219 06:58:21.384120 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:21.384166 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:21.384192 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:21.445710 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:21.445751 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:21.463018 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:21.463049 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:21.498498 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:21.498539 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:21.534973 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:21.535010 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:21.606144 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:21.606163 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:21.606185 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:21.639902 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:21.639934 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:21.677500 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:21.677532 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:21.720447 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:21.720479 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:24.253894 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:24.264025 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:24.264098 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:24.290780 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:24.290801 2212400 cri.go:92] found id: ""
	I1219 06:58:24.290809 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:24.290890 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:24.294805 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:24.294894 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:24.320705 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:24.320729 2212400 cri.go:92] found id: ""
	I1219 06:58:24.320737 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:24.320828 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:24.324552 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:24.324632 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:24.352865 2212400 cri.go:92] found id: ""
	I1219 06:58:24.352894 2212400 logs.go:282] 0 containers: []
	W1219 06:58:24.352904 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:24.352911 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:24.352994 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:24.380161 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:24.380183 2212400 cri.go:92] found id: ""
	I1219 06:58:24.380192 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:24.380255 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:24.383951 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:24.384025 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:24.413773 2212400 cri.go:92] found id: ""
	I1219 06:58:24.413795 2212400 logs.go:282] 0 containers: []
	W1219 06:58:24.413804 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:24.413810 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:24.413874 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:24.439082 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:24.439112 2212400 cri.go:92] found id: ""
	I1219 06:58:24.439122 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:24.439182 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:24.443161 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:24.443245 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:24.468894 2212400 cri.go:92] found id: ""
	I1219 06:58:24.468919 2212400 logs.go:282] 0 containers: []
	W1219 06:58:24.468928 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:24.468934 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:24.468995 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:24.501974 2212400 cri.go:92] found id: ""
	I1219 06:58:24.502003 2212400 logs.go:282] 0 containers: []
	W1219 06:58:24.502012 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:24.502029 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:24.502041 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:24.560526 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:24.560562 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:24.577015 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:24.577042 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:24.650370 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:24.650389 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:24.650401 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:24.701792 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:24.701866 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:24.737138 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:24.737175 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:24.777869 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:24.777903 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:24.813786 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:24.813836 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:24.843384 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:24.843411 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:27.395127 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:27.406889 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:27.406970 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:27.433562 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:27.433587 2212400 cri.go:92] found id: ""
	I1219 06:58:27.433595 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:27.433652 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:27.437421 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:27.437497 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:27.461990 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:27.462013 2212400 cri.go:92] found id: ""
	I1219 06:58:27.462020 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:27.462075 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:27.465765 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:27.465837 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:27.489631 2212400 cri.go:92] found id: ""
	I1219 06:58:27.489656 2212400 logs.go:282] 0 containers: []
	W1219 06:58:27.489665 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:27.489672 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:27.489732 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:27.519163 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:27.519186 2212400 cri.go:92] found id: ""
	I1219 06:58:27.519194 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:27.519251 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:27.522984 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:27.523061 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:27.547722 2212400 cri.go:92] found id: ""
	I1219 06:58:27.547749 2212400 logs.go:282] 0 containers: []
	W1219 06:58:27.547758 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:27.547764 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:27.547829 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:27.572480 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:27.572503 2212400 cri.go:92] found id: ""
	I1219 06:58:27.572511 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:27.572569 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:27.576010 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:27.576095 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:27.600207 2212400 cri.go:92] found id: ""
	I1219 06:58:27.600231 2212400 logs.go:282] 0 containers: []
	W1219 06:58:27.600240 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:27.600247 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:27.600352 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:27.625573 2212400 cri.go:92] found id: ""
	I1219 06:58:27.625597 2212400 logs.go:282] 0 containers: []
	W1219 06:58:27.625606 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:27.625643 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:27.625665 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:27.665234 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:27.665266 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:27.697227 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:27.697263 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:27.731329 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:27.731357 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:27.789713 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:27.789751 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:27.826775 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:27.826806 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:27.859311 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:27.859340 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:27.875406 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:27.875433 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:27.952813 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:27.952834 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:27.952847 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:30.487744 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:30.497923 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:30.497990 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:30.527058 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:30.527078 2212400 cri.go:92] found id: ""
	I1219 06:58:30.527085 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:30.527142 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:30.531314 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:30.531388 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:30.562053 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:30.562070 2212400 cri.go:92] found id: ""
	I1219 06:58:30.562078 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:30.562132 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:30.567070 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:30.567193 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:30.598982 2212400 cri.go:92] found id: ""
	I1219 06:58:30.599003 2212400 logs.go:282] 0 containers: []
	W1219 06:58:30.599011 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:30.599018 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:30.599078 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:30.624786 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:30.624856 2212400 cri.go:92] found id: ""
	I1219 06:58:30.624877 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:30.624963 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:30.629517 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:30.629649 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:30.676999 2212400 cri.go:92] found id: ""
	I1219 06:58:30.677074 2212400 logs.go:282] 0 containers: []
	W1219 06:58:30.677096 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:30.677116 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:30.677200 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:30.739660 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:30.739731 2212400 cri.go:92] found id: ""
	I1219 06:58:30.739752 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:30.739837 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:30.744224 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:30.744351 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:30.773815 2212400 cri.go:92] found id: ""
	I1219 06:58:30.773887 2212400 logs.go:282] 0 containers: []
	W1219 06:58:30.773909 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:30.773930 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:30.774013 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:30.806207 2212400 cri.go:92] found id: ""
	I1219 06:58:30.806294 2212400 logs.go:282] 0 containers: []
	W1219 06:58:30.806317 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:30.806345 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:30.806393 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:30.896934 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:30.897002 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:30.897029 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:30.937882 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:30.937987 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:30.991245 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:30.991276 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:31.039890 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:31.040008 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:31.106109 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:31.106196 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:31.153258 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:31.153331 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:31.186724 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:31.186756 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:31.221246 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:31.221278 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:33.745974 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:33.756203 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:33.756275 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:33.781929 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:33.781952 2212400 cri.go:92] found id: ""
	I1219 06:58:33.781960 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:33.782017 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:33.785647 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:33.785717 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:33.810466 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:33.810488 2212400 cri.go:92] found id: ""
	I1219 06:58:33.810496 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:33.810556 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:33.814342 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:33.814425 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:33.844208 2212400 cri.go:92] found id: ""
	I1219 06:58:33.844234 2212400 logs.go:282] 0 containers: []
	W1219 06:58:33.844243 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:33.844249 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:33.844311 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:33.869832 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:33.869853 2212400 cri.go:92] found id: ""
	I1219 06:58:33.869862 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:33.869919 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:33.873542 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:33.873612 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:33.898128 2212400 cri.go:92] found id: ""
	I1219 06:58:33.898153 2212400 logs.go:282] 0 containers: []
	W1219 06:58:33.898161 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:33.898168 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:33.898239 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:33.923545 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:33.923568 2212400 cri.go:92] found id: ""
	I1219 06:58:33.923576 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:33.923633 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:33.927355 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:33.927428 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:33.956110 2212400 cri.go:92] found id: ""
	I1219 06:58:33.956135 2212400 logs.go:282] 0 containers: []
	W1219 06:58:33.956144 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:33.956151 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:33.956210 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:33.981027 2212400 cri.go:92] found id: ""
	I1219 06:58:33.981050 2212400 logs.go:282] 0 containers: []
	W1219 06:58:33.981060 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:33.981075 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:33.981087 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:34.021578 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:34.021624 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:34.055729 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:34.055800 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:34.118107 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:34.118157 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:34.151937 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:34.152059 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:34.190750 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:34.190776 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:34.289877 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:34.289891 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:34.289903 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:34.341505 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:34.342153 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:34.430622 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:34.430657 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:36.961550 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:36.971762 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:36.971830 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:36.995984 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:36.996007 2212400 cri.go:92] found id: ""
	I1219 06:58:36.996015 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:36.996071 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:36.999722 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:36.999796 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:37.030854 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:37.030878 2212400 cri.go:92] found id: ""
	I1219 06:58:37.030887 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:37.030948 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:37.036238 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:37.036316 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:37.061981 2212400 cri.go:92] found id: ""
	I1219 06:58:37.062005 2212400 logs.go:282] 0 containers: []
	W1219 06:58:37.062014 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:37.062021 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:37.062086 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:37.086882 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:37.086907 2212400 cri.go:92] found id: ""
	I1219 06:58:37.086916 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:37.086974 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:37.090644 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:37.090731 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:37.114987 2212400 cri.go:92] found id: ""
	I1219 06:58:37.115013 2212400 logs.go:282] 0 containers: []
	W1219 06:58:37.115021 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:37.115028 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:37.115093 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:37.143148 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:37.143170 2212400 cri.go:92] found id: ""
	I1219 06:58:37.143178 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:37.143236 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:37.147140 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:37.147218 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:37.172009 2212400 cri.go:92] found id: ""
	I1219 06:58:37.172034 2212400 logs.go:282] 0 containers: []
	W1219 06:58:37.172043 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:37.172049 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:37.172110 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:37.198317 2212400 cri.go:92] found id: ""
	I1219 06:58:37.198341 2212400 logs.go:282] 0 containers: []
	W1219 06:58:37.198349 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:37.198363 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:37.198375 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:37.227245 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:37.227278 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:37.289975 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:37.290014 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:37.306727 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:37.306757 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:37.374963 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:37.374995 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:37.375009 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:37.434340 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:37.434374 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:37.482424 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:37.482454 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:37.511940 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:37.512016 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:37.552309 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:37.552340 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:40.089414 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:40.100407 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:40.100479 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:40.125988 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:40.126026 2212400 cri.go:92] found id: ""
	I1219 06:58:40.126035 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:40.126098 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:40.129912 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:40.129986 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:40.154992 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:40.155022 2212400 cri.go:92] found id: ""
	I1219 06:58:40.155032 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:40.155100 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:40.159188 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:40.159261 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:40.190526 2212400 cri.go:92] found id: ""
	I1219 06:58:40.190551 2212400 logs.go:282] 0 containers: []
	W1219 06:58:40.190559 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:40.190566 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:40.190631 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:40.215836 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:40.215859 2212400 cri.go:92] found id: ""
	I1219 06:58:40.215868 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:40.215927 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:40.219651 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:40.219723 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:40.246847 2212400 cri.go:92] found id: ""
	I1219 06:58:40.246872 2212400 logs.go:282] 0 containers: []
	W1219 06:58:40.246882 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:40.246888 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:40.246951 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:40.272052 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:40.272076 2212400 cri.go:92] found id: ""
	I1219 06:58:40.272084 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:40.272156 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:40.275740 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:40.275811 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:40.300973 2212400 cri.go:92] found id: ""
	I1219 06:58:40.300999 2212400 logs.go:282] 0 containers: []
	W1219 06:58:40.301009 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:40.301015 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:40.301078 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:40.325291 2212400 cri.go:92] found id: ""
	I1219 06:58:40.325317 2212400 logs.go:282] 0 containers: []
	W1219 06:58:40.325326 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:40.325340 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:40.325353 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:40.393672 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:40.393743 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:40.393770 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:40.431246 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:40.431321 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:40.463829 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:40.463902 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:40.504588 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:40.504621 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:40.534825 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:40.534859 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:40.563363 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:40.563438 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:40.626236 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:40.626273 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:40.643395 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:40.643423 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:43.181954 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:43.192230 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:43.192304 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:43.217391 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:43.217412 2212400 cri.go:92] found id: ""
	I1219 06:58:43.217420 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:43.217477 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:43.221262 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:43.221341 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:43.247330 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:43.247355 2212400 cri.go:92] found id: ""
	I1219 06:58:43.247364 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:43.247420 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:43.251031 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:43.251103 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:43.278256 2212400 cri.go:92] found id: ""
	I1219 06:58:43.278282 2212400 logs.go:282] 0 containers: []
	W1219 06:58:43.278291 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:43.278298 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:43.278360 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:43.308036 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:43.308058 2212400 cri.go:92] found id: ""
	I1219 06:58:43.308066 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:43.308123 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:43.311930 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:43.312004 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:43.343609 2212400 cri.go:92] found id: ""
	I1219 06:58:43.343641 2212400 logs.go:282] 0 containers: []
	W1219 06:58:43.343650 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:43.343656 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:43.343717 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:43.377048 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:43.377071 2212400 cri.go:92] found id: ""
	I1219 06:58:43.377085 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:43.380887 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:43.386423 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:43.386504 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:43.427340 2212400 cri.go:92] found id: ""
	I1219 06:58:43.427366 2212400 logs.go:282] 0 containers: []
	W1219 06:58:43.427383 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:43.427408 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:43.427483 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:43.457309 2212400 cri.go:92] found id: ""
	I1219 06:58:43.457337 2212400 logs.go:282] 0 containers: []
	W1219 06:58:43.457346 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:43.457391 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:43.457410 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:43.477322 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:43.477354 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:43.512655 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:43.512687 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:43.545682 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:43.545712 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:43.582691 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:43.582728 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:43.613229 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:43.613264 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:43.643344 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:43.643377 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:43.703895 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:43.703933 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:43.765076 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:43.765097 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:43.765111 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:46.313811 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:46.324283 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:46.324357 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:46.351067 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:46.351087 2212400 cri.go:92] found id: ""
	I1219 06:58:46.351096 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:46.351169 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:46.354950 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:46.355035 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:46.387132 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:46.387151 2212400 cri.go:92] found id: ""
	I1219 06:58:46.387159 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:46.387215 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:46.390857 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:46.390966 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:46.421018 2212400 cri.go:92] found id: ""
	I1219 06:58:46.421041 2212400 logs.go:282] 0 containers: []
	W1219 06:58:46.421049 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:46.421056 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:46.421114 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:46.459244 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:46.459263 2212400 cri.go:92] found id: ""
	I1219 06:58:46.459271 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:46.459329 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:46.463444 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:46.463512 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:46.492202 2212400 cri.go:92] found id: ""
	I1219 06:58:46.492278 2212400 logs.go:282] 0 containers: []
	W1219 06:58:46.492299 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:46.492317 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:46.492406 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:46.517300 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:46.517324 2212400 cri.go:92] found id: ""
	I1219 06:58:46.517331 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:46.517387 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:46.521089 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:46.521161 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:46.547892 2212400 cri.go:92] found id: ""
	I1219 06:58:46.547924 2212400 logs.go:282] 0 containers: []
	W1219 06:58:46.547933 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:46.547955 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:46.548041 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:46.572847 2212400 cri.go:92] found id: ""
	I1219 06:58:46.572921 2212400 logs.go:282] 0 containers: []
	W1219 06:58:46.572942 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:46.572958 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:46.572970 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:46.636002 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:46.636045 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:46.652549 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:46.652591 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:46.718787 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:46.718809 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:46.718824 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:46.752562 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:46.752601 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:46.787121 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:46.787153 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:46.817397 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:46.817434 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:46.849576 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:46.849611 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:46.884886 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:46.884920 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:49.423860 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:49.436408 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:49.436495 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:49.470083 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:49.470103 2212400 cri.go:92] found id: ""
	I1219 06:58:49.470111 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:49.470168 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:49.474014 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:49.474091 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:49.498857 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:49.498881 2212400 cri.go:92] found id: ""
	I1219 06:58:49.498889 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:49.498948 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:49.502628 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:49.502698 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:49.526884 2212400 cri.go:92] found id: ""
	I1219 06:58:49.526910 2212400 logs.go:282] 0 containers: []
	W1219 06:58:49.526920 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:49.526926 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:49.526987 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:49.552377 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:49.552404 2212400 cri.go:92] found id: ""
	I1219 06:58:49.552413 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:49.552475 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:49.556144 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:49.556219 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:49.580955 2212400 cri.go:92] found id: ""
	I1219 06:58:49.580980 2212400 logs.go:282] 0 containers: []
	W1219 06:58:49.580989 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:49.580995 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:49.581059 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:49.606766 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:49.606790 2212400 cri.go:92] found id: ""
	I1219 06:58:49.606799 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:49.606856 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:49.610763 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:49.610843 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:49.639033 2212400 cri.go:92] found id: ""
	I1219 06:58:49.639058 2212400 logs.go:282] 0 containers: []
	W1219 06:58:49.639067 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:49.639074 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:49.639135 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:49.665321 2212400 cri.go:92] found id: ""
	I1219 06:58:49.665348 2212400 logs.go:282] 0 containers: []
	W1219 06:58:49.665356 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:49.665374 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:49.665411 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:49.682066 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:49.682101 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:49.747436 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:49.747495 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:49.747522 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:49.782968 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:49.782999 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:49.823041 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:49.823074 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:49.861969 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:49.862008 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:49.895702 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:49.895733 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:49.955041 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:49.955078 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:49.982781 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:49.982817 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:52.536798 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:52.546687 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:52.546756 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:52.571197 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:52.571221 2212400 cri.go:92] found id: ""
	I1219 06:58:52.571229 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:52.571288 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:52.575070 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:52.575143 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:52.602434 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:52.602534 2212400 cri.go:92] found id: ""
	I1219 06:58:52.602543 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:52.602605 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:52.606475 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:52.606549 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:52.631276 2212400 cri.go:92] found id: ""
	I1219 06:58:52.631304 2212400 logs.go:282] 0 containers: []
	W1219 06:58:52.631313 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:52.631320 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:52.631382 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:52.658888 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:52.658911 2212400 cri.go:92] found id: ""
	I1219 06:58:52.658918 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:52.658978 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:52.663220 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:52.663293 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:52.695844 2212400 cri.go:92] found id: ""
	I1219 06:58:52.695874 2212400 logs.go:282] 0 containers: []
	W1219 06:58:52.695883 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:52.695889 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:52.695951 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:52.722263 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:52.722287 2212400 cri.go:92] found id: ""
	I1219 06:58:52.722295 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:52.722354 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:52.726103 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:52.726187 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:52.751237 2212400 cri.go:92] found id: ""
	I1219 06:58:52.751265 2212400 logs.go:282] 0 containers: []
	W1219 06:58:52.751274 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:52.751281 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:52.751390 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:52.775846 2212400 cri.go:92] found id: ""
	I1219 06:58:52.775875 2212400 logs.go:282] 0 containers: []
	W1219 06:58:52.775884 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:52.775919 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:52.775937 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:52.834068 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:52.834104 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:52.898536 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:52.898558 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:52.898572 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:52.933328 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:52.933358 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:52.964510 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:52.964542 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:53.000709 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:53.000739 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:53.044953 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:53.044982 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:53.073132 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:53.073167 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:53.089484 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:53.089513 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:55.628875 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:55.646783 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:55.646870 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:55.687346 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:55.687369 2212400 cri.go:92] found id: ""
	I1219 06:58:55.687377 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:55.687435 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:55.692973 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:55.693078 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:55.728165 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:55.728253 2212400 cri.go:92] found id: ""
	I1219 06:58:55.728277 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:55.728377 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:55.733592 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:55.733678 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:55.771500 2212400 cri.go:92] found id: ""
	I1219 06:58:55.771524 2212400 logs.go:282] 0 containers: []
	W1219 06:58:55.771533 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:55.771539 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:55.771604 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:55.809423 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:55.809445 2212400 cri.go:92] found id: ""
	I1219 06:58:55.809453 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:55.809515 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:55.813891 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:55.814016 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:55.845582 2212400 cri.go:92] found id: ""
	I1219 06:58:55.845663 2212400 logs.go:282] 0 containers: []
	W1219 06:58:55.845682 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:55.845690 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:55.845760 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:55.871772 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:55.871796 2212400 cri.go:92] found id: ""
	I1219 06:58:55.871805 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:55.871894 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:55.877382 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:55.877496 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:55.903262 2212400 cri.go:92] found id: ""
	I1219 06:58:55.903289 2212400 logs.go:282] 0 containers: []
	W1219 06:58:55.903299 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:55.903305 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:55.903424 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:55.929895 2212400 cri.go:92] found id: ""
	I1219 06:58:55.929919 2212400 logs.go:282] 0 containers: []
	W1219 06:58:55.929927 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:55.929945 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:55.929957 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:55.964010 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:55.964044 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:56.005378 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:56.005417 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:56.052162 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:56.052198 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:56.081545 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:56.081582 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:56.113273 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:56.113304 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:56.176054 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:56.176098 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:56.193299 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:56.193329 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:56.264012 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:56.264038 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:56.264051 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:58.798882 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:58:58.809142 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:58:58.809217 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:58:58.836651 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:58.836679 2212400 cri.go:92] found id: ""
	I1219 06:58:58.836687 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:58:58.836748 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:58.841236 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:58:58.841311 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:58:58.879082 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:58:58.879108 2212400 cri.go:92] found id: ""
	I1219 06:58:58.879116 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:58:58.879171 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:58.884218 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:58:58.884304 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:58:58.924282 2212400 cri.go:92] found id: ""
	I1219 06:58:58.924317 2212400 logs.go:282] 0 containers: []
	W1219 06:58:58.924326 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:58:58.924333 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:58:58.924399 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:58:58.959063 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:58.959089 2212400 cri.go:92] found id: ""
	I1219 06:58:58.959104 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:58:58.959160 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:58.964284 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:58:58.964368 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:58:58.996639 2212400 cri.go:92] found id: ""
	I1219 06:58:58.996667 2212400 logs.go:282] 0 containers: []
	W1219 06:58:58.996676 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:58:58.996682 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:58:58.996741 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:58:59.040920 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:59.040945 2212400 cri.go:92] found id: ""
	I1219 06:58:59.040954 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:58:59.041009 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:58:59.045477 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:58:59.045553 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:58:59.084707 2212400 cri.go:92] found id: ""
	I1219 06:58:59.084735 2212400 logs.go:282] 0 containers: []
	W1219 06:58:59.084744 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:58:59.084750 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:58:59.084832 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:58:59.121971 2212400 cri.go:92] found id: ""
	I1219 06:58:59.121997 2212400 logs.go:282] 0 containers: []
	W1219 06:58:59.122027 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:58:59.122043 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:58:59.122060 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:58:59.180436 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:58:59.192743 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:58:59.279889 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:58:59.279964 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:58:59.365786 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:58:59.365849 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:58:59.365874 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:58:59.404701 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:58:59.404806 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:58:59.453682 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:58:59.453712 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:58:59.515796 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:58:59.515831 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:58:59.532402 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:58:59.532432 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:58:59.575931 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:58:59.575961 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:59:02.109962 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:59:02.122700 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:59:02.122771 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:59:02.178525 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:59:02.178605 2212400 cri.go:92] found id: ""
	I1219 06:59:02.178627 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:59:02.178714 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:02.185671 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:59:02.185772 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:59:02.251119 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:59:02.251139 2212400 cri.go:92] found id: ""
	I1219 06:59:02.251148 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:59:02.251207 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:02.255780 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:59:02.255940 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:59:02.300102 2212400 cri.go:92] found id: ""
	I1219 06:59:02.300129 2212400 logs.go:282] 0 containers: []
	W1219 06:59:02.300138 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:59:02.300144 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:59:02.300226 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:59:02.341120 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:59:02.341144 2212400 cri.go:92] found id: ""
	I1219 06:59:02.341153 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:59:02.341238 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:02.345562 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:59:02.345657 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:59:02.381664 2212400 cri.go:92] found id: ""
	I1219 06:59:02.381689 2212400 logs.go:282] 0 containers: []
	W1219 06:59:02.381699 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:59:02.381705 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:59:02.381788 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:59:02.426144 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:59:02.426169 2212400 cri.go:92] found id: ""
	I1219 06:59:02.426185 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:59:02.426263 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:02.430219 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:59:02.430317 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:59:02.461737 2212400 cri.go:92] found id: ""
	I1219 06:59:02.461820 2212400 logs.go:282] 0 containers: []
	W1219 06:59:02.461845 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:59:02.461868 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:59:02.461965 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:59:02.504844 2212400 cri.go:92] found id: ""
	I1219 06:59:02.504977 2212400 logs.go:282] 0 containers: []
	W1219 06:59:02.505001 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:59:02.505050 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:59:02.505085 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:59:02.550813 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:59:02.550888 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:59:02.639030 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:59:02.639113 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:59:02.684218 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:59:02.684304 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:59:02.731481 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:59:02.731559 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:59:02.754710 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:59:02.754790 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:59:02.836440 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:59:02.836510 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:59:02.836537 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:59:02.907692 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:59:02.907730 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:59:02.995660 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:59:02.995738 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:59:05.539256 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:59:05.549328 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:59:05.549402 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:59:05.574635 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:59:05.574658 2212400 cri.go:92] found id: ""
	I1219 06:59:05.574666 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:59:05.574723 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:05.578282 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:59:05.578357 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:59:05.602950 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:59:05.602972 2212400 cri.go:92] found id: ""
	I1219 06:59:05.602980 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:59:05.603038 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:05.606729 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:59:05.606798 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:59:05.631492 2212400 cri.go:92] found id: ""
	I1219 06:59:05.631515 2212400 logs.go:282] 0 containers: []
	W1219 06:59:05.631524 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:59:05.631530 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:59:05.631589 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:59:05.657240 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:59:05.657263 2212400 cri.go:92] found id: ""
	I1219 06:59:05.657270 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:59:05.657326 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:05.661369 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:59:05.661437 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:59:05.693063 2212400 cri.go:92] found id: ""
	I1219 06:59:05.693084 2212400 logs.go:282] 0 containers: []
	W1219 06:59:05.693093 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:59:05.693099 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:59:05.693159 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:59:05.733313 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:59:05.733332 2212400 cri.go:92] found id: ""
	I1219 06:59:05.733339 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:59:05.733396 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:05.737750 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:59:05.737824 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:59:05.764409 2212400 cri.go:92] found id: ""
	I1219 06:59:05.764431 2212400 logs.go:282] 0 containers: []
	W1219 06:59:05.764439 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:59:05.764445 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:59:05.764506 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:59:05.800819 2212400 cri.go:92] found id: ""
	I1219 06:59:05.800841 2212400 logs.go:282] 0 containers: []
	W1219 06:59:05.800855 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:59:05.800869 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:59:05.800880 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:59:05.866185 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:59:05.866268 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:59:05.919606 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:59:05.919686 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:59:05.959235 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:59:05.959309 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:59:06.021141 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:59:06.021220 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:59:06.086712 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:59:06.086787 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:59:06.105403 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:59:06.105494 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:59:06.192588 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:59:06.192652 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:59:06.192676 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:59:06.223047 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:59:06.223080 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:59:08.766250 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:59:08.776575 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:59:08.776651 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:59:08.802852 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:59:08.802877 2212400 cri.go:92] found id: ""
	I1219 06:59:08.802886 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:59:08.802948 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:08.806713 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:59:08.806785 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:59:08.831271 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:59:08.831294 2212400 cri.go:92] found id: ""
	I1219 06:59:08.831302 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:59:08.831360 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:08.835181 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:59:08.835257 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:59:08.861014 2212400 cri.go:92] found id: ""
	I1219 06:59:08.861037 2212400 logs.go:282] 0 containers: []
	W1219 06:59:08.861045 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:59:08.861051 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:59:08.861118 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:59:08.885966 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:59:08.885987 2212400 cri.go:92] found id: ""
	I1219 06:59:08.885995 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:59:08.886055 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:08.889720 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:59:08.889793 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:59:08.918460 2212400 cri.go:92] found id: ""
	I1219 06:59:08.918488 2212400 logs.go:282] 0 containers: []
	W1219 06:59:08.918497 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:59:08.918503 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:59:08.918568 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:59:08.955200 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:59:08.955226 2212400 cri.go:92] found id: ""
	I1219 06:59:08.955235 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:59:08.955292 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:08.965453 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:59:08.965532 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:59:08.992678 2212400 cri.go:92] found id: ""
	I1219 06:59:08.992701 2212400 logs.go:282] 0 containers: []
	W1219 06:59:08.992710 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:59:08.992717 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:59:08.992802 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:59:09.019923 2212400 cri.go:92] found id: ""
	I1219 06:59:09.019950 2212400 logs.go:282] 0 containers: []
	W1219 06:59:09.019959 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:59:09.019975 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:59:09.019990 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:59:09.036554 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:59:09.036584 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:59:09.072855 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:59:09.072886 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:59:09.106579 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:59:09.106612 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:59:09.172361 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:59:09.172406 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:59:09.239642 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:59:09.239665 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:59:09.239679 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:59:09.274608 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:59:09.274640 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:59:09.317071 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:59:09.317147 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:59:09.355445 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:59:09.355507 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:59:11.904342 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:59:11.914804 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 06:59:11.914878 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 06:59:11.945786 2212400 cri.go:92] found id: "a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:59:11.945812 2212400 cri.go:92] found id: ""
	I1219 06:59:11.945820 2212400 logs.go:282] 1 containers: [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207]
	I1219 06:59:11.945877 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:11.950014 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 06:59:11.950093 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 06:59:11.978410 2212400 cri.go:92] found id: "178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:59:11.978434 2212400 cri.go:92] found id: ""
	I1219 06:59:11.978443 2212400 logs.go:282] 1 containers: [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb]
	I1219 06:59:11.978530 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:11.982353 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 06:59:11.982426 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 06:59:12.009827 2212400 cri.go:92] found id: ""
	I1219 06:59:12.009858 2212400 logs.go:282] 0 containers: []
	W1219 06:59:12.009869 2212400 logs.go:284] No container was found matching "coredns"
	I1219 06:59:12.009876 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 06:59:12.009942 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 06:59:12.037054 2212400 cri.go:92] found id: "d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:59:12.037076 2212400 cri.go:92] found id: ""
	I1219 06:59:12.037084 2212400 logs.go:282] 1 containers: [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5]
	I1219 06:59:12.037142 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:12.040916 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 06:59:12.040989 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 06:59:12.066132 2212400 cri.go:92] found id: ""
	I1219 06:59:12.066157 2212400 logs.go:282] 0 containers: []
	W1219 06:59:12.066166 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 06:59:12.066178 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 06:59:12.066242 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 06:59:12.095980 2212400 cri.go:92] found id: "77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:59:12.096004 2212400 cri.go:92] found id: ""
	I1219 06:59:12.096012 2212400 logs.go:282] 1 containers: [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea]
	I1219 06:59:12.096071 2212400 ssh_runner.go:195] Run: which crictl
	I1219 06:59:12.099835 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 06:59:12.099907 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 06:59:12.125063 2212400 cri.go:92] found id: ""
	I1219 06:59:12.125088 2212400 logs.go:282] 0 containers: []
	W1219 06:59:12.125098 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 06:59:12.125104 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 06:59:12.125172 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 06:59:12.150802 2212400 cri.go:92] found id: ""
	I1219 06:59:12.150829 2212400 logs.go:282] 0 containers: []
	W1219 06:59:12.150838 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 06:59:12.150852 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 06:59:12.150885 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 06:59:12.180236 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 06:59:12.180273 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1219 06:59:12.237635 2212400 logs.go:123] Gathering logs for kube-apiserver [a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207] ...
	I1219 06:59:12.237670 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207"
	I1219 06:59:12.273209 2212400 logs.go:123] Gathering logs for etcd [178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb] ...
	I1219 06:59:12.273243 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb"
	I1219 06:59:12.311851 2212400 logs.go:123] Gathering logs for kube-scheduler [d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5] ...
	I1219 06:59:12.311882 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5"
	I1219 06:59:12.352235 2212400 logs.go:123] Gathering logs for container status ...
	I1219 06:59:12.352267 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 06:59:12.393047 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 06:59:12.393082 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 06:59:12.409783 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 06:59:12.409810 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 06:59:12.472090 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 06:59:12.472113 2212400 logs.go:123] Gathering logs for kube-controller-manager [77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea] ...
	I1219 06:59:12.472164 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea"
	I1219 06:59:15.016886 2212400 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:59:15.030532 2212400 kubeadm.go:602] duration metric: took 4m4.91758494s to restartPrimaryControlPlane
	W1219 06:59:15.030613 2212400 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1219 06:59:15.030690 2212400 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1219 06:59:15.504787 2212400 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:59:15.518515 2212400 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1219 06:59:15.526832 2212400 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1219 06:59:15.526902 2212400 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 06:59:15.534857 2212400 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1219 06:59:15.534879 2212400 kubeadm.go:158] found existing configuration files:
	
	I1219 06:59:15.534930 2212400 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1219 06:59:15.542653 2212400 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1219 06:59:15.542720 2212400 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1219 06:59:15.550468 2212400 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1219 06:59:15.558162 2212400 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1219 06:59:15.558239 2212400 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 06:59:15.565573 2212400 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1219 06:59:15.573436 2212400 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1219 06:59:15.573502 2212400 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 06:59:15.580947 2212400 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1219 06:59:15.589519 2212400 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1219 06:59:15.589587 2212400 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 06:59:15.596946 2212400 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1219 06:59:15.636836 2212400 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1219 06:59:15.637180 2212400 kubeadm.go:319] [preflight] Running pre-flight checks
	I1219 06:59:15.710388 2212400 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1219 06:59:15.710465 2212400 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1219 06:59:15.710505 2212400 kubeadm.go:319] OS: Linux
	I1219 06:59:15.710554 2212400 kubeadm.go:319] CGROUPS_CPU: enabled
	I1219 06:59:15.710609 2212400 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1219 06:59:15.710659 2212400 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1219 06:59:15.710711 2212400 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1219 06:59:15.710763 2212400 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1219 06:59:15.710815 2212400 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1219 06:59:15.710870 2212400 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1219 06:59:15.710920 2212400 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1219 06:59:15.710972 2212400 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1219 06:59:15.775250 2212400 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1219 06:59:15.775366 2212400 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1219 06:59:15.775460 2212400 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1219 06:59:25.185906 2212400 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1219 06:59:25.188861 2212400 out.go:252]   - Generating certificates and keys ...
	I1219 06:59:25.188953 2212400 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1219 06:59:25.189043 2212400 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1219 06:59:25.189129 2212400 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1219 06:59:25.189192 2212400 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1219 06:59:25.189262 2212400 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1219 06:59:25.189316 2212400 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1219 06:59:25.189379 2212400 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1219 06:59:25.189440 2212400 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1219 06:59:25.189669 2212400 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1219 06:59:25.190103 2212400 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1219 06:59:25.190416 2212400 kubeadm.go:319] [certs] Using the existing "sa" key
	I1219 06:59:25.190500 2212400 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1219 06:59:25.360372 2212400 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1219 06:59:25.537897 2212400 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1219 06:59:25.689393 2212400 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1219 06:59:26.129695 2212400 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1219 06:59:26.426254 2212400 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1219 06:59:26.427069 2212400 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1219 06:59:26.429739 2212400 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1219 06:59:26.433260 2212400 out.go:252]   - Booting up control plane ...
	I1219 06:59:26.433368 2212400 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1219 06:59:26.433446 2212400 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1219 06:59:26.433513 2212400 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1219 06:59:26.455584 2212400 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1219 06:59:26.455702 2212400 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1219 06:59:26.463740 2212400 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1219 06:59:26.464171 2212400 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1219 06:59:26.464391 2212400 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1219 06:59:26.591840 2212400 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1219 06:59:26.591969 2212400 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1219 07:03:26.595982 2212400 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001280942s
	I1219 07:03:26.596017 2212400 kubeadm.go:319] 
	I1219 07:03:26.596076 2212400 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1219 07:03:26.596109 2212400 kubeadm.go:319] 	- The kubelet is not running
	I1219 07:03:26.596213 2212400 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1219 07:03:26.596219 2212400 kubeadm.go:319] 
	I1219 07:03:26.596323 2212400 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1219 07:03:26.596355 2212400 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1219 07:03:26.596386 2212400 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1219 07:03:26.596390 2212400 kubeadm.go:319] 
	I1219 07:03:26.596901 2212400 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 07:03:26.597352 2212400 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1219 07:03:26.597467 2212400 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 07:03:26.597719 2212400 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1219 07:03:26.597724 2212400 kubeadm.go:319] 
	I1219 07:03:26.597797 2212400 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1219 07:03:26.598031 2212400 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001280942s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1219 07:03:26.598179 2212400 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1219 07:03:27.027221 2212400 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 07:03:27.045698 2212400 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1219 07:03:27.045763 2212400 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 07:03:27.057574 2212400 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1219 07:03:27.057598 2212400 kubeadm.go:158] found existing configuration files:
	
	I1219 07:03:27.057654 2212400 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1219 07:03:27.067826 2212400 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1219 07:03:27.067932 2212400 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1219 07:03:27.076905 2212400 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1219 07:03:27.086469 2212400 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1219 07:03:27.086542 2212400 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 07:03:27.094937 2212400 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1219 07:03:27.104316 2212400 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1219 07:03:27.104385 2212400 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 07:03:27.114325 2212400 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1219 07:03:27.124316 2212400 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1219 07:03:27.124384 2212400 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1219 07:03:27.134729 2212400 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1219 07:03:27.189911 2212400 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-rc.1
	I1219 07:03:27.190195 2212400 kubeadm.go:319] [preflight] Running pre-flight checks
	I1219 07:03:27.345672 2212400 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1219 07:03:27.345744 2212400 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1219 07:03:27.345786 2212400 kubeadm.go:319] OS: Linux
	I1219 07:03:27.345836 2212400 kubeadm.go:319] CGROUPS_CPU: enabled
	I1219 07:03:27.345889 2212400 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1219 07:03:27.346368 2212400 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1219 07:03:27.346451 2212400 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1219 07:03:27.346511 2212400 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1219 07:03:27.346567 2212400 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1219 07:03:27.346613 2212400 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1219 07:03:27.346662 2212400 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1219 07:03:27.346709 2212400 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1219 07:03:27.447159 2212400 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1219 07:03:27.447277 2212400 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1219 07:03:27.447376 2212400 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1219 07:03:27.457775 2212400 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1219 07:03:27.463324 2212400 out.go:252]   - Generating certificates and keys ...
	I1219 07:03:27.463423 2212400 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1219 07:03:27.463491 2212400 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1219 07:03:27.463570 2212400 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1219 07:03:27.463637 2212400 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1219 07:03:27.463712 2212400 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1219 07:03:27.463770 2212400 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1219 07:03:27.463837 2212400 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1219 07:03:27.463902 2212400 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1219 07:03:27.463998 2212400 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1219 07:03:27.464205 2212400 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1219 07:03:27.464561 2212400 kubeadm.go:319] [certs] Using the existing "sa" key
	I1219 07:03:27.464686 2212400 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1219 07:03:27.654289 2212400 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1219 07:03:27.967546 2212400 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1219 07:03:28.037186 2212400 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1219 07:03:28.342406 2212400 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1219 07:03:28.428732 2212400 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1219 07:03:28.429291 2212400 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1219 07:03:28.432042 2212400 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1219 07:03:28.435362 2212400 out.go:252]   - Booting up control plane ...
	I1219 07:03:28.435469 2212400 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1219 07:03:28.435548 2212400 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1219 07:03:28.435627 2212400 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1219 07:03:28.455284 2212400 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1219 07:03:28.455780 2212400 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1219 07:03:28.464190 2212400 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1219 07:03:28.465548 2212400 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1219 07:03:28.465975 2212400 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1219 07:03:28.601761 2212400 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1219 07:03:28.601893 2212400 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1219 07:07:28.602406 2212400 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000904725s
	I1219 07:07:28.602457 2212400 kubeadm.go:319] 
	I1219 07:07:28.602516 2212400 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1219 07:07:28.602552 2212400 kubeadm.go:319] 	- The kubelet is not running
	I1219 07:07:28.602665 2212400 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1219 07:07:28.602672 2212400 kubeadm.go:319] 
	I1219 07:07:28.602776 2212400 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1219 07:07:28.602807 2212400 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1219 07:07:28.602838 2212400 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1219 07:07:28.602842 2212400 kubeadm.go:319] 
	I1219 07:07:28.606599 2212400 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 07:07:28.607025 2212400 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1219 07:07:28.607132 2212400 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 07:07:28.607366 2212400 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1219 07:07:28.607373 2212400 kubeadm.go:319] 
	I1219 07:07:28.607441 2212400 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1219 07:07:28.607493 2212400 kubeadm.go:403] duration metric: took 12m18.671397417s to StartCluster
	I1219 07:07:28.607527 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 07:07:28.607587 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 07:07:28.634525 2212400 cri.go:92] found id: ""
	I1219 07:07:28.634558 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.634568 2212400 logs.go:284] No container was found matching "kube-apiserver"
	I1219 07:07:28.634574 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 07:07:28.634633 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 07:07:28.674733 2212400 cri.go:92] found id: ""
	I1219 07:07:28.674756 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.674765 2212400 logs.go:284] No container was found matching "etcd"
	I1219 07:07:28.674771 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 07:07:28.674836 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 07:07:28.722927 2212400 cri.go:92] found id: ""
	I1219 07:07:28.722948 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.722957 2212400 logs.go:284] No container was found matching "coredns"
	I1219 07:07:28.722963 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 07:07:28.723032 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 07:07:28.767653 2212400 cri.go:92] found id: ""
	I1219 07:07:28.767675 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.767684 2212400 logs.go:284] No container was found matching "kube-scheduler"
	I1219 07:07:28.767691 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 07:07:28.767752 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 07:07:28.797822 2212400 cri.go:92] found id: ""
	I1219 07:07:28.797844 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.797852 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 07:07:28.797864 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 07:07:28.797922 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 07:07:28.826985 2212400 cri.go:92] found id: ""
	I1219 07:07:28.827007 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.827016 2212400 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 07:07:28.827023 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 07:07:28.827082 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 07:07:28.863020 2212400 cri.go:92] found id: ""
	I1219 07:07:28.863096 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.863128 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 07:07:28.863148 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 07:07:28.863238 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 07:07:28.891494 2212400 cri.go:92] found id: ""
	I1219 07:07:28.891516 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.891524 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 07:07:28.891536 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 07:07:28.891548 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 07:07:28.937806 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 07:07:28.937892 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 07:07:29.048357 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 07:07:29.048428 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 07:07:29.048443 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 07:07:29.099207 2212400 logs.go:123] Gathering logs for container status ...
	I1219 07:07:29.099289 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 07:07:29.143691 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 07:07:29.143715 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1219 07:07:29.208467 2212400 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000904725s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1219 07:07:29.208575 2212400 out.go:285] * 
	W1219 07:07:29.208658 2212400 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000904725s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1219 07:07:29.208712 2212400 out.go:285] * 
	W1219 07:07:29.210920 2212400 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1219 07:07:29.216145 2212400 out.go:203] 
	W1219 07:07:29.219984 2212400 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000904725s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1219 07:07:29.220090 2212400 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1219 07:07:29.220149 2212400 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1219 07:07:29.223701 2212400 out.go:203] 

** /stderr **
version_upgrade_test.go:245: failed to upgrade with newest k8s version. args: out/minikube-linux-arm64 start -p kubernetes-upgrade-352421 --memory=3072 --kubernetes-version=v1.35.0-rc.1 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd : exit status 109
version_upgrade_test.go:248: (dbg) Run:  kubectl --context kubernetes-upgrade-352421 version --output=json
version_upgrade_test.go:248: (dbg) Non-zero exit: kubectl --context kubernetes-upgrade-352421 version --output=json: exit status 1 (147.459363ms)

-- stdout --
	{
	  "clientVersion": {
	    "major": "1",
	    "minor": "33",
	    "gitVersion": "v1.33.2",
	    "gitCommit": "a57b6f7709f6c2722b92f07b8b4c48210a51fc40",
	    "gitTreeState": "clean",
	    "buildDate": "2025-06-17T18:41:31Z",
	    "goVersion": "go1.24.4",
	    "compiler": "gc",
	    "platform": "linux/arm64"
	  },
	  "kustomizeVersion": "v5.6.0"
	}

-- /stdout --
** stderr ** 
	The connection to the server 192.168.76.2:8443 was refused - did you specify the right host or port?

** /stderr **
version_upgrade_test.go:250: error running kubectl: exit status 1
panic.go:615: *** TestKubernetesUpgrade FAILED at 2025-12-19 07:07:30.373723901 +0000 UTC m=+5065.695308277
helpers_test.go:223: -----------------------post-mortem--------------------------------
helpers_test.go:224: ======>  post-mortem[TestKubernetesUpgrade]: network settings <======
helpers_test.go:231: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:239: ======>  post-mortem[TestKubernetesUpgrade]: docker inspect <======
helpers_test.go:240: (dbg) Run:  docker inspect kubernetes-upgrade-352421
helpers_test.go:244: (dbg) docker inspect kubernetes-upgrade-352421:

-- stdout --
	[
	    {
	        "Id": "d877c65fc4858cc6f6bceb6253ded01b98ff230739b320b0f93bfa6ff8ddac90",
	        "Created": "2025-12-19T06:54:21.588176371Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 2212527,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-19T06:54:53.366210742Z",
	            "FinishedAt": "2025-12-19T06:54:52.335586517Z"
	        },
	        "Image": "sha256:1411dfa4fea1291ce69fcd55acb99f3fbff3e701cee30fdd4f0b2561ac0ef6b0",
	        "ResolvConfPath": "/var/lib/docker/containers/d877c65fc4858cc6f6bceb6253ded01b98ff230739b320b0f93bfa6ff8ddac90/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/d877c65fc4858cc6f6bceb6253ded01b98ff230739b320b0f93bfa6ff8ddac90/hostname",
	        "HostsPath": "/var/lib/docker/containers/d877c65fc4858cc6f6bceb6253ded01b98ff230739b320b0f93bfa6ff8ddac90/hosts",
	        "LogPath": "/var/lib/docker/containers/d877c65fc4858cc6f6bceb6253ded01b98ff230739b320b0f93bfa6ff8ddac90/d877c65fc4858cc6f6bceb6253ded01b98ff230739b320b0f93bfa6ff8ddac90-json.log",
	        "Name": "/kubernetes-upgrade-352421",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "kubernetes-upgrade-352421:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "kubernetes-upgrade-352421",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 3221225472,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 6442450944,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "d877c65fc4858cc6f6bceb6253ded01b98ff230739b320b0f93bfa6ff8ddac90",
	                "LowerDir": "/var/lib/docker/overlay2/25cf4f92924bc54ae34dbe8f1507427f9a305ecee189f8302f0c32a44b0efa09-init/diff:/var/lib/docker/overlay2/00358d85eab3b52f9d297862c5ac97673efd866f7bb8f8781bf0c1744f50abc5/diff",
	                "MergedDir": "/var/lib/docker/overlay2/25cf4f92924bc54ae34dbe8f1507427f9a305ecee189f8302f0c32a44b0efa09/merged",
	                "UpperDir": "/var/lib/docker/overlay2/25cf4f92924bc54ae34dbe8f1507427f9a305ecee189f8302f0c32a44b0efa09/diff",
	                "WorkDir": "/var/lib/docker/overlay2/25cf4f92924bc54ae34dbe8f1507427f9a305ecee189f8302f0c32a44b0efa09/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "kubernetes-upgrade-352421",
	                "Source": "/var/lib/docker/volumes/kubernetes-upgrade-352421/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "kubernetes-upgrade-352421",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8443/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "kubernetes-upgrade-352421",
	                "name.minikube.sigs.k8s.io": "kubernetes-upgrade-352421",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "0f6e2b3e39f8c58583308fd64d5c29afe6238523b6beb26fe21715e881a536cd",
	            "SandboxKey": "/var/run/docker/netns/0f6e2b3e39f8",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34929"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34930"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34933"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34931"
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "34932"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "kubernetes-upgrade-352421": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.76.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "a6:d7:ef:bc:d7:87",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "36b67d41ec30845431259e78d63f8bd1971ca90bdd815bd032564d18cbbbbbff",
	                    "EndpointID": "ba1071b0dc385cdcfc98202fb2bc88bd9747b5b65a4258adc498a3040a1d503e",
	                    "Gateway": "192.168.76.1",
	                    "IPAddress": "192.168.76.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "kubernetes-upgrade-352421",
	                        "d877c65fc485"
	                    ]
	                }
	            }
	        }
	    }
	]

                                                
                                                
-- /stdout --
helpers_test.go:248: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p kubernetes-upgrade-352421 -n kubernetes-upgrade-352421
helpers_test.go:248: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p kubernetes-upgrade-352421 -n kubernetes-upgrade-352421: exit status 2 (457.502619ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:248: status error: exit status 2 (may be ok)
helpers_test.go:253: <<< TestKubernetesUpgrade FAILED: start of post-mortem logs <<<
helpers_test.go:254: ======>  post-mortem[TestKubernetesUpgrade]: minikube logs <======
helpers_test.go:256: (dbg) Run:  out/minikube-linux-arm64 -p kubernetes-upgrade-352421 logs -n 25
helpers_test.go:261: TestKubernetesUpgrade logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                       ARGS                                                       │         PROFILE          │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh     │ -p cilium-248561 sudo systemctl status kubelet --all --full --no-pager                                           │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo systemctl cat kubelet --no-pager                                                           │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo journalctl -xeu kubelet --all --full --no-pager                                            │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo cat /etc/kubernetes/kubelet.conf                                                           │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo cat /var/lib/kubelet/config.yaml                                                           │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo systemctl status docker --all --full --no-pager                                            │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo systemctl cat docker --no-pager                                                            │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo cat /etc/docker/daemon.json                                                                │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo docker system info                                                                         │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo systemctl status cri-docker --all --full --no-pager                                        │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo systemctl cat cri-docker --no-pager                                                        │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo cat /etc/systemd/system/cri-docker.service.d/10-cni.conf                                   │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo cat /usr/lib/systemd/system/cri-docker.service                                             │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo cri-dockerd --version                                                                      │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo systemctl status containerd --all --full --no-pager                                        │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo systemctl cat containerd --no-pager                                                        │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo cat /lib/systemd/system/containerd.service                                                 │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo cat /etc/containerd/config.toml                                                            │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo containerd config dump                                                                     │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo systemctl status crio --all --full --no-pager                                              │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo systemctl cat crio --no-pager                                                              │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo find /etc/crio -type f -exec sh -c 'echo {}; cat {}' \;                                    │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ ssh     │ -p cilium-248561 sudo crio config                                                                                │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	│ delete  │ -p cilium-248561                                                                                                 │ cilium-248561            │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │ 19 Dec 25 07:07 UTC │
	│ start   │ -p force-systemd-env-631273 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd │ force-systemd-env-631273 │ jenkins │ v1.37.0 │ 19 Dec 25 07:07 UTC │                     │
	└─────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 07:07:08
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 07:07:08.729473 2259035 out.go:360] Setting OutFile to fd 1 ...
	I1219 07:07:08.729687 2259035 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 07:07:08.729713 2259035 out.go:374] Setting ErrFile to fd 2...
	I1219 07:07:08.729735 2259035 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 07:07:08.730042 2259035 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 07:07:08.730550 2259035 out.go:368] Setting JSON to false
	I1219 07:07:08.731455 2259035 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":42575,"bootTime":1766085454,"procs":174,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 07:07:08.731554 2259035 start.go:143] virtualization:  
	I1219 07:07:08.736934 2259035 out.go:179] * [force-systemd-env-631273] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 07:07:08.739912 2259035 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 07:07:08.739982 2259035 notify.go:221] Checking for updates...
	I1219 07:07:08.745767 2259035 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 07:07:08.748656 2259035 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 07:07:08.751470 2259035 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 07:07:08.754343 2259035 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 07:07:08.757236 2259035 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=true
	I1219 07:07:08.760823 2259035 config.go:182] Loaded profile config "kubernetes-upgrade-352421": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 07:07:08.760935 2259035 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 07:07:08.786382 2259035 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 07:07:08.786495 2259035 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 07:07:08.841528 2259035 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 07:07:08.832404725 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 07:07:08.841640 2259035 docker.go:319] overlay module found
	I1219 07:07:08.844747 2259035 out.go:179] * Using the docker driver based on user configuration
	I1219 07:07:08.847704 2259035 start.go:309] selected driver: docker
	I1219 07:07:08.847723 2259035 start.go:928] validating driver "docker" against <nil>
	I1219 07:07:08.847737 2259035 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 07:07:08.848467 2259035 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 07:07:08.910838 2259035 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:38 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 07:07:08.901062905 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 07:07:08.910998 2259035 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1219 07:07:08.911230 2259035 start_flags.go:975] Wait components to verify : map[apiserver:true system_pods:true]
	I1219 07:07:08.914194 2259035 out.go:179] * Using Docker driver with root privileges
	I1219 07:07:08.917041 2259035 cni.go:84] Creating CNI manager for ""
	I1219 07:07:08.917119 2259035 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 07:07:08.917138 2259035 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1219 07:07:08.917223 2259035 start.go:353] cluster config:
	{Name:force-systemd-env-631273 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:force-systemd-env-631273 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 07:07:08.920379 2259035 out.go:179] * Starting "force-systemd-env-631273" primary control-plane node in "force-systemd-env-631273" cluster
	I1219 07:07:08.923291 2259035 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1219 07:07:08.926206 2259035 out.go:179] * Pulling base image v0.0.48-1765966054-22186 ...
	I1219 07:07:08.928960 2259035 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime containerd
	I1219 07:07:08.929019 2259035 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4
	I1219 07:07:08.929035 2259035 cache.go:65] Caching tarball of preloaded images
	I1219 07:07:08.929046 2259035 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1219 07:07:08.929135 2259035 preload.go:238] Found /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1219 07:07:08.929146 2259035 cache.go:68] Finished verifying existence of preloaded tar for v1.34.3 on containerd
	I1219 07:07:08.929254 2259035 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/config.json ...
	I1219 07:07:08.929271 2259035 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/config.json: {Name:mkd82595759266f465139af0ca627da9ba301b01 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 07:07:08.948392 2259035 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon, skipping pull
	I1219 07:07:08.948414 2259035 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 exists in daemon, skipping load
	I1219 07:07:08.948441 2259035 cache.go:243] Successfully downloaded all kic artifacts
	I1219 07:07:08.948471 2259035 start.go:360] acquireMachinesLock for force-systemd-env-631273: {Name:mkc060d724e30658c9f07732b3a254e7fb09b5f1 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1219 07:07:08.948589 2259035 start.go:364] duration metric: took 97.97µs to acquireMachinesLock for "force-systemd-env-631273"
	I1219 07:07:08.948620 2259035 start.go:93] Provisioning new machine with config: &{Name:force-systemd-env-631273 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:force-systemd-env-631273 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1219 07:07:08.948694 2259035 start.go:125] createHost starting for "" (driver="docker")
	I1219 07:07:08.953955 2259035 out.go:252] * Creating docker container (CPUs=2, Memory=3072MB) ...
	I1219 07:07:08.954211 2259035 start.go:159] libmachine.API.Create for "force-systemd-env-631273" (driver="docker")
	I1219 07:07:08.954253 2259035 client.go:173] LocalClient.Create starting
	I1219 07:07:08.954327 2259035 main.go:144] libmachine: Reading certificate data from /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem
	I1219 07:07:08.954371 2259035 main.go:144] libmachine: Decoding PEM data...
	I1219 07:07:08.954390 2259035 main.go:144] libmachine: Parsing certificate...
	I1219 07:07:08.954457 2259035 main.go:144] libmachine: Reading certificate data from /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem
	I1219 07:07:08.954478 2259035 main.go:144] libmachine: Decoding PEM data...
	I1219 07:07:08.954493 2259035 main.go:144] libmachine: Parsing certificate...
	I1219 07:07:08.954873 2259035 cli_runner.go:164] Run: docker network inspect force-systemd-env-631273 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1219 07:07:08.969245 2259035 cli_runner.go:211] docker network inspect force-systemd-env-631273 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1219 07:07:08.969330 2259035 network_create.go:284] running [docker network inspect force-systemd-env-631273] to gather additional debugging logs...
	I1219 07:07:08.969350 2259035 cli_runner.go:164] Run: docker network inspect force-systemd-env-631273
	W1219 07:07:08.983992 2259035 cli_runner.go:211] docker network inspect force-systemd-env-631273 returned with exit code 1
	I1219 07:07:08.984047 2259035 network_create.go:287] error running [docker network inspect force-systemd-env-631273]: docker network inspect force-systemd-env-631273: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network force-systemd-env-631273 not found
	I1219 07:07:08.984075 2259035 network_create.go:289] output of [docker network inspect force-systemd-env-631273]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network force-systemd-env-631273 not found
	
	** /stderr **
	I1219 07:07:08.984194 2259035 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 07:07:08.999355 2259035 network.go:211] skipping subnet 192.168.49.0/24 that is taken: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName:br-e2ac0574767c IfaceIPv4:192.168.49.1 IfaceMTU:1500 IfaceMAC:e2:05:6c:fe:29:8c} reservation:<nil>}
	I1219 07:07:08.999741 2259035 network.go:211] skipping subnet 192.168.58.0/24 that is taken: &{IP:192.168.58.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.58.0/24 Gateway:192.168.58.1 ClientMin:192.168.58.2 ClientMax:192.168.58.254 Broadcast:192.168.58.255 IsPrivate:true Interface:{IfaceName:br-6625893d0542 IfaceIPv4:192.168.58.1 IfaceMTU:1500 IfaceMAC:4e:f5:d7:88:e8:be} reservation:<nil>}
	I1219 07:07:09.000115 2259035 network.go:211] skipping subnet 192.168.67.0/24 that is taken: &{IP:192.168.67.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.67.0/24 Gateway:192.168.67.1 ClientMin:192.168.67.2 ClientMax:192.168.67.254 Broadcast:192.168.67.255 IsPrivate:true Interface:{IfaceName:br-08c301f9fb62 IfaceIPv4:192.168.67.1 IfaceMTU:1500 IfaceMAC:8a:69:23:b7:05:19} reservation:<nil>}
	I1219 07:07:09.000510 2259035 network.go:211] skipping subnet 192.168.76.0/24 that is taken: &{IP:192.168.76.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.76.0/24 Gateway:192.168.76.1 ClientMin:192.168.76.2 ClientMax:192.168.76.254 Broadcast:192.168.76.255 IsPrivate:true Interface:{IfaceName:br-36b67d41ec30 IfaceIPv4:192.168.76.1 IfaceMTU:1500 IfaceMAC:0e:cd:a7:76:ef:f3} reservation:<nil>}
	I1219 07:07:09.001200 2259035 network.go:206] using free private subnet 192.168.85.0/24: &{IP:192.168.85.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.85.0/24 Gateway:192.168.85.1 ClientMin:192.168.85.2 ClientMax:192.168.85.254 Broadcast:192.168.85.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x4001869ca0}
	I1219 07:07:09.001243 2259035 network_create.go:124] attempt to create docker network force-systemd-env-631273 192.168.85.0/24 with gateway 192.168.85.1 and MTU of 1500 ...
	I1219 07:07:09.001308 2259035 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.85.0/24 --gateway=192.168.85.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=force-systemd-env-631273 force-systemd-env-631273
	I1219 07:07:09.076315 2259035 network_create.go:108] docker network force-systemd-env-631273 192.168.85.0/24 created
	I1219 07:07:09.076370 2259035 kic.go:121] calculated static IP "192.168.85.2" for the "force-systemd-env-631273" container
	I1219 07:07:09.076461 2259035 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1219 07:07:09.093712 2259035 cli_runner.go:164] Run: docker volume create force-systemd-env-631273 --label name.minikube.sigs.k8s.io=force-systemd-env-631273 --label created_by.minikube.sigs.k8s.io=true
	I1219 07:07:09.111526 2259035 oci.go:103] Successfully created a docker volume force-systemd-env-631273
	I1219 07:07:09.111621 2259035 cli_runner.go:164] Run: docker run --rm --name force-systemd-env-631273-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=force-systemd-env-631273 --entrypoint /usr/bin/test -v force-systemd-env-631273:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -d /var/lib
	I1219 07:07:09.651608 2259035 oci.go:107] Successfully prepared a docker volume force-systemd-env-631273
	I1219 07:07:09.651694 2259035 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime containerd
	I1219 07:07:09.651706 2259035 kic.go:194] Starting extracting preloaded images to volume ...
	I1219 07:07:09.651772 2259035 cli_runner.go:164] Run: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v force-systemd-env-631273:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -I lz4 -xf /preloaded.tar -C /extractDir
	I1219 07:07:13.621613 2259035 cli_runner.go:217] Completed: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v force-systemd-env-631273:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 -I lz4 -xf /preloaded.tar -C /extractDir: (3.969804756s)
	I1219 07:07:13.621647 2259035 kic.go:203] duration metric: took 3.969937664s to extract preloaded images to volume ...
	W1219 07:07:13.621788 2259035 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1219 07:07:13.621909 2259035 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1219 07:07:13.674610 2259035 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname force-systemd-env-631273 --name force-systemd-env-631273 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=force-systemd-env-631273 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=force-systemd-env-631273 --network force-systemd-env-631273 --ip 192.168.85.2 --volume force-systemd-env-631273:/var --security-opt apparmor=unconfined --memory=3072mb --cpus=2 -e container=docker --expose 8443 --publish=127.0.0.1::8443 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0
	I1219 07:07:13.980519 2259035 cli_runner.go:164] Run: docker container inspect force-systemd-env-631273 --format={{.State.Running}}
	I1219 07:07:14.006531 2259035 cli_runner.go:164] Run: docker container inspect force-systemd-env-631273 --format={{.State.Status}}
	I1219 07:07:14.029687 2259035 cli_runner.go:164] Run: docker exec force-systemd-env-631273 stat /var/lib/dpkg/alternatives/iptables
	I1219 07:07:14.076505 2259035 oci.go:144] the created container "force-systemd-env-631273" has a running status.
	I1219 07:07:14.076541 2259035 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/force-systemd-env-631273/id_rsa...
	I1219 07:07:14.291324 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/force-systemd-env-631273/id_rsa.pub -> /home/docker/.ssh/authorized_keys
	I1219 07:07:14.291429 2259035 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/force-systemd-env-631273/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1219 07:07:14.318680 2259035 cli_runner.go:164] Run: docker container inspect force-systemd-env-631273 --format={{.State.Status}}
	I1219 07:07:14.353876 2259035 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1219 07:07:14.353912 2259035 kic_runner.go:114] Args: [docker exec --privileged force-systemd-env-631273 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1219 07:07:14.415616 2259035 cli_runner.go:164] Run: docker container inspect force-systemd-env-631273 --format={{.State.Status}}
	I1219 07:07:14.438383 2259035 machine.go:94] provisionDockerMachine start ...
	I1219 07:07:14.438486 2259035 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-631273
	I1219 07:07:14.466750 2259035 main.go:144] libmachine: Using SSH client type: native
	I1219 07:07:14.467118 2259035 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34959 <nil> <nil>}
	I1219 07:07:14.467130 2259035 main.go:144] libmachine: About to run SSH command:
	hostname
	I1219 07:07:14.468038 2259035 main.go:144] libmachine: Error dialing TCP: ssh: handshake failed: read tcp 127.0.0.1:46362->127.0.0.1:34959: read: connection reset by peer
	I1219 07:07:17.624443 2259035 main.go:144] libmachine: SSH cmd err, output: <nil>: force-systemd-env-631273
	
	I1219 07:07:17.624468 2259035 ubuntu.go:182] provisioning hostname "force-systemd-env-631273"
	I1219 07:07:17.624534 2259035 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-631273
	I1219 07:07:17.642840 2259035 main.go:144] libmachine: Using SSH client type: native
	I1219 07:07:17.643168 2259035 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34959 <nil> <nil>}
	I1219 07:07:17.643186 2259035 main.go:144] libmachine: About to run SSH command:
	sudo hostname force-systemd-env-631273 && echo "force-systemd-env-631273" | sudo tee /etc/hostname
	I1219 07:07:17.814747 2259035 main.go:144] libmachine: SSH cmd err, output: <nil>: force-systemd-env-631273
	
	I1219 07:07:17.814915 2259035 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-631273
	I1219 07:07:17.834147 2259035 main.go:144] libmachine: Using SSH client type: native
	I1219 07:07:17.834489 2259035 main.go:144] libmachine: &{{{<nil> 0 [] [] []} docker [0x3db5e0] 0x3ddae0 <nil>  [] 0s} 127.0.0.1 34959 <nil> <nil>}
	I1219 07:07:17.834512 2259035 main.go:144] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sforce-systemd-env-631273' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 force-systemd-env-631273/g' /etc/hosts;
				else 
					echo '127.0.1.1 force-systemd-env-631273' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1219 07:07:17.993129 2259035 main.go:144] libmachine: SSH cmd err, output: <nil>: 
	I1219 07:07:17.993156 2259035 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22230-1998525/.minikube CaCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22230-1998525/.minikube}
	I1219 07:07:17.993180 2259035 ubuntu.go:190] setting up certificates
	I1219 07:07:17.993190 2259035 provision.go:84] configureAuth start
	I1219 07:07:17.993264 2259035 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" force-systemd-env-631273
	I1219 07:07:18.016043 2259035 provision.go:143] copyHostCerts
	I1219 07:07:18.016097 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 07:07:18.016134 2259035 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem, removing ...
	I1219 07:07:18.016155 2259035 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem
	I1219 07:07:18.016246 2259035 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.pem (1078 bytes)
	I1219 07:07:18.016338 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 07:07:18.016361 2259035 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem, removing ...
	I1219 07:07:18.016366 2259035 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem
	I1219 07:07:18.016402 2259035 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/cert.pem (1123 bytes)
	I1219 07:07:18.016454 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 07:07:18.016475 2259035 exec_runner.go:144] found /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem, removing ...
	I1219 07:07:18.016479 2259035 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem
	I1219 07:07:18.016506 2259035 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22230-1998525/.minikube/key.pem (1671 bytes)
	I1219 07:07:18.016555 2259035 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem org=jenkins.force-systemd-env-631273 san=[127.0.0.1 192.168.85.2 force-systemd-env-631273 localhost minikube]
	I1219 07:07:18.143823 2259035 provision.go:177] copyRemoteCerts
	I1219 07:07:18.143900 2259035 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1219 07:07:18.143941 2259035 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-631273
	I1219 07:07:18.161083 2259035 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34959 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/force-systemd-env-631273/id_rsa Username:docker}
	I1219 07:07:18.272744 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1219 07:07:18.272815 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server.pem --> /etc/docker/server.pem (1237 bytes)
	I1219 07:07:18.290486 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1219 07:07:18.290548 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1219 07:07:18.307774 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1219 07:07:18.307835 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I1219 07:07:18.325189 2259035 provision.go:87] duration metric: took 331.977281ms to configureAuth
	I1219 07:07:18.325216 2259035 ubuntu.go:206] setting minikube options for container-runtime
	I1219 07:07:18.325399 2259035 config.go:182] Loaded profile config "force-systemd-env-631273": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1219 07:07:18.325406 2259035 machine.go:97] duration metric: took 3.887001893s to provisionDockerMachine
	I1219 07:07:18.325413 2259035 client.go:176] duration metric: took 9.371150613s to LocalClient.Create
	I1219 07:07:18.325426 2259035 start.go:167] duration metric: took 9.371217658s to libmachine.API.Create "force-systemd-env-631273"
	I1219 07:07:18.325433 2259035 start.go:293] postStartSetup for "force-systemd-env-631273" (driver="docker")
	I1219 07:07:18.325442 2259035 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1219 07:07:18.325494 2259035 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1219 07:07:18.325535 2259035 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-631273
	I1219 07:07:18.343845 2259035 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34959 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/force-systemd-env-631273/id_rsa Username:docker}
	I1219 07:07:18.454211 2259035 ssh_runner.go:195] Run: cat /etc/os-release
	I1219 07:07:18.457817 2259035 main.go:144] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1219 07:07:18.457851 2259035 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1219 07:07:18.457864 2259035 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/addons for local assets ...
	I1219 07:07:18.457922 2259035 filesync.go:126] Scanning /home/jenkins/minikube-integration/22230-1998525/.minikube/files for local assets ...
	I1219 07:07:18.458024 2259035 filesync.go:149] local asset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> 20003862.pem in /etc/ssl/certs
	I1219 07:07:18.458037 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> /etc/ssl/certs/20003862.pem
	I1219 07:07:18.458142 2259035 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1219 07:07:18.466003 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 07:07:18.483556 2259035 start.go:296] duration metric: took 158.107915ms for postStartSetup
	I1219 07:07:18.483921 2259035 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" force-systemd-env-631273
	I1219 07:07:18.500423 2259035 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/config.json ...
	I1219 07:07:18.500697 2259035 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 07:07:18.500748 2259035 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-631273
	I1219 07:07:18.517546 2259035 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34959 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/force-systemd-env-631273/id_rsa Username:docker}
	I1219 07:07:18.621900 2259035 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1219 07:07:18.626979 2259035 start.go:128] duration metric: took 9.67826729s to createHost
	I1219 07:07:18.627006 2259035 start.go:83] releasing machines lock for "force-systemd-env-631273", held for 9.678404416s
	I1219 07:07:18.627079 2259035 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" force-systemd-env-631273
	I1219 07:07:18.644123 2259035 ssh_runner.go:195] Run: cat /version.json
	I1219 07:07:18.644179 2259035 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-631273
	I1219 07:07:18.644438 2259035 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1219 07:07:18.644507 2259035 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" force-systemd-env-631273
	I1219 07:07:18.666705 2259035 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34959 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/force-systemd-env-631273/id_rsa Username:docker}
	I1219 07:07:18.692887 2259035 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34959 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/force-systemd-env-631273/id_rsa Username:docker}
	I1219 07:07:18.776661 2259035 ssh_runner.go:195] Run: systemctl --version
	I1219 07:07:18.866788 2259035 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1219 07:07:18.871280 2259035 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1219 07:07:18.871407 2259035 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1219 07:07:18.898884 2259035 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1219 07:07:18.898916 2259035 start.go:496] detecting cgroup driver to use...
	I1219 07:07:18.898943 2259035 start.go:500] using "systemd" cgroup driver as enforced via flags
	I1219 07:07:18.899005 2259035 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1219 07:07:18.914921 2259035 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1219 07:07:18.928325 2259035 docker.go:218] disabling cri-docker service (if available) ...
	I1219 07:07:18.928450 2259035 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1219 07:07:18.946787 2259035 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1219 07:07:18.965556 2259035 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1219 07:07:19.079157 2259035 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1219 07:07:19.220576 2259035 docker.go:234] disabling docker service ...
	I1219 07:07:19.220680 2259035 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1219 07:07:19.243610 2259035 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1219 07:07:19.256895 2259035 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1219 07:07:19.379169 2259035 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1219 07:07:19.494926 2259035 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1219 07:07:19.508147 2259035 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1219 07:07:19.523515 2259035 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1219 07:07:19.533303 2259035 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1219 07:07:19.542906 2259035 containerd.go:146] configuring containerd to use "systemd" as cgroup driver...
	I1219 07:07:19.542986 2259035 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = true|g' /etc/containerd/config.toml"
	I1219 07:07:19.551748 2259035 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 07:07:19.560498 2259035 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1219 07:07:19.569480 2259035 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1219 07:07:19.578289 2259035 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1219 07:07:19.586301 2259035 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1219 07:07:19.595240 2259035 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1219 07:07:19.604183 2259035 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1219 07:07:19.613722 2259035 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1219 07:07:19.621220 2259035 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1219 07:07:19.628422 2259035 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 07:07:19.748868 2259035 ssh_runner.go:195] Run: sudo systemctl restart containerd
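The containerd phase above rewrites `/etc/containerd/config.toml` with a series of `sed` commands before restarting the daemon; the key one forces `SystemdCgroup = true` because this profile enforces the "systemd" cgroup driver. A minimal standalone sketch of that rewrite, using a hypothetical config fragment and Python's `re` in place of `sed`:

```python
# Hypothetical fragment of /etc/containerd/config.toml; the real file is
# rewritten in place inside the minikube container.
import re

config = """\
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = false
"""

# Equivalent of the logged command:
#   sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = true|g'
# (?m) makes ^/$ match per line; \1 preserves the leading indentation.
patched = re.sub(r"(?m)^( *)SystemdCgroup = .*$", r"\1SystemdCgroup = true", config)
print("SystemdCgroup = true" in patched)  # -> True
```

After a rewrite like this, the log's `systemctl daemon-reload` and `systemctl restart containerd` steps are what actually make the new cgroup driver take effect.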
	I1219 07:07:19.886864 2259035 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1219 07:07:19.887015 2259035 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1219 07:07:19.891267 2259035 start.go:564] Will wait 60s for crictl version
	I1219 07:07:19.891423 2259035 ssh_runner.go:195] Run: which crictl
	I1219 07:07:19.895094 2259035 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1219 07:07:19.923426 2259035 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.2.0
	RuntimeApiVersion:  v1
	I1219 07:07:19.923511 2259035 ssh_runner.go:195] Run: containerd --version
	I1219 07:07:19.956027 2259035 ssh_runner.go:195] Run: containerd --version
	I1219 07:07:19.983864 2259035 out.go:179] * Preparing Kubernetes v1.34.3 on containerd 2.2.0 ...
	I1219 07:07:19.986830 2259035 cli_runner.go:164] Run: docker network inspect force-systemd-env-631273 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1219 07:07:20.008821 2259035 ssh_runner.go:195] Run: grep 192.168.85.1	host.minikube.internal$ /etc/hosts
	I1219 07:07:20.017076 2259035 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.85.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1219 07:07:20.029032 2259035 kubeadm.go:884] updating cluster {Name:force-systemd-env-631273 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:force-systemd-env-631273 Namespace:default APIServerHAVIP: APIServerName:
minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticI
P: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1219 07:07:20.029185 2259035 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime containerd
	I1219 07:07:20.029283 2259035 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 07:07:20.058644 2259035 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 07:07:20.058673 2259035 containerd.go:534] Images already preloaded, skipping extraction
	I1219 07:07:20.058744 2259035 ssh_runner.go:195] Run: sudo crictl images --output json
	I1219 07:07:20.085430 2259035 containerd.go:627] all images are preloaded for containerd runtime.
	I1219 07:07:20.085456 2259035 cache_images.go:86] Images are preloaded, skipping loading
	I1219 07:07:20.085464 2259035 kubeadm.go:935] updating node { 192.168.85.2 8443 v1.34.3 containerd true true} ...
	I1219 07:07:20.085568 2259035 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=force-systemd-env-631273 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.85.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.3 ClusterName:force-systemd-env-631273 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1219 07:07:20.085646 2259035 ssh_runner.go:195] Run: sudo crictl --timeout=10s info
	I1219 07:07:20.112886 2259035 cni.go:84] Creating CNI manager for ""
	I1219 07:07:20.112913 2259035 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 07:07:20.112961 2259035 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1219 07:07:20.112992 2259035 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.85.2 APIServerPort:8443 KubernetesVersion:v1.34.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:force-systemd-env-631273 NodeName:force-systemd-env-631273 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.85.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.85.2 CgroupDriver:systemd ClientCAFile:/var/lib/minikube/certs/ca.c
rt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1219 07:07:20.113159 2259035 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.85.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "force-systemd-env-631273"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.85.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.85.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: systemd
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
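The kubeadm config dumped above is a multi-document YAML (`InitConfiguration`, `ClusterConfiguration`, `KubeletConfiguration`, `KubeProxyConfiguration` separated by `---`), and the force-systemd expectation shows up as `cgroupDriver: systemd` in the kubelet document. A small sketch (hypothetical helper, stdlib-only string handling rather than a YAML parser) of picking out one document by `kind` and checking that field, against an abridged stand-in for the generated file:

```python
def doc_for_kind(yaml_text: str, kind: str) -> str:
    """Return the '---'-separated YAML document whose 'kind:' matches."""
    for doc in yaml_text.split("\n---\n"):
        for line in doc.splitlines():
            if line.strip() == f"kind: {kind}":
                return doc
    raise KeyError(kind)

# Abridged stand-in for the kubeadm.yaml minikube generates above.
kubeadm_yaml = """\
apiVersion: kubeadm.k8s.io/v1beta4
kind: ClusterConfiguration
kubernetesVersion: v1.34.3
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd
"""

kubelet_doc = doc_for_kind(kubeadm_yaml, "KubeletConfiguration")
print("cgroupDriver: systemd" in kubelet_doc)  # -> True
```

This mirrors what the test profile asserts indirectly: with `--force-systemd` semantics, both containerd (`SystemdCgroup = true`) and the kubelet (`cgroupDriver: systemd`) must agree on the cgroup driver.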
	I1219 07:07:20.113247 2259035 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.3
	I1219 07:07:20.121735 2259035 binaries.go:51] Found k8s binaries, skipping transfer
	I1219 07:07:20.121863 2259035 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1219 07:07:20.136113 2259035 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1219 07:07:20.149851 2259035 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1219 07:07:20.164168 2259035 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2236 bytes)
	I1219 07:07:20.177272 2259035 ssh_runner.go:195] Run: grep 192.168.85.2	control-plane.minikube.internal$ /etc/hosts
	I1219 07:07:20.180972 2259035 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.85.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1219 07:07:20.191637 2259035 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1219 07:07:20.307593 2259035 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1219 07:07:20.325281 2259035 certs.go:69] Setting up /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273 for IP: 192.168.85.2
	I1219 07:07:20.325353 2259035 certs.go:195] generating shared ca certs ...
	I1219 07:07:20.325399 2259035 certs.go:227] acquiring lock for ca certs: {Name:mk382c71693ea4061363f97b153b21bf6cdf5f38 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 07:07:20.325582 2259035 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key
	I1219 07:07:20.325666 2259035 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key
	I1219 07:07:20.325703 2259035 certs.go:257] generating profile certs ...
	I1219 07:07:20.325796 2259035 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/client.key
	I1219 07:07:20.325828 2259035 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/client.crt with IP's: []
	I1219 07:07:20.600927 2259035 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/client.crt ...
	I1219 07:07:20.600961 2259035 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/client.crt: {Name:mk4dd17b332185c82b1a0af4502150d751ac1dc8 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 07:07:20.601164 2259035 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/client.key ...
	I1219 07:07:20.601178 2259035 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/client.key: {Name:mk123544028991e9adc6ee80aeaa1aaf961acf9a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 07:07:20.601268 2259035 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.key.3ac177e2
	I1219 07:07:20.601286 2259035 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.crt.3ac177e2 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.85.2]
	I1219 07:07:21.204046 2259035 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.crt.3ac177e2 ...
	I1219 07:07:21.204077 2259035 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.crt.3ac177e2: {Name:mkd19f6c4aa872a66afaa5f40c81337765816dc6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 07:07:21.204252 2259035 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.key.3ac177e2 ...
	I1219 07:07:21.204271 2259035 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.key.3ac177e2: {Name:mk5de7a26025010a34df844cbcf72aa6f6055744 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 07:07:21.204343 2259035 certs.go:382] copying /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.crt.3ac177e2 -> /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.crt
	I1219 07:07:21.204433 2259035 certs.go:386] copying /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.key.3ac177e2 -> /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.key
	I1219 07:07:21.204492 2259035 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/proxy-client.key
	I1219 07:07:21.204511 2259035 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/proxy-client.crt with IP's: []
	I1219 07:07:21.263520 2259035 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/proxy-client.crt ...
	I1219 07:07:21.263553 2259035 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/proxy-client.crt: {Name:mkc68c5b82c79db280c1a384924cb07d1f508be0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 07:07:21.263722 2259035 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/proxy-client.key ...
	I1219 07:07:21.263736 2259035 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/proxy-client.key: {Name:mke9e9cb73b23fdc6ee5b14a4305287f0dea8986 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 07:07:21.263822 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1219 07:07:21.263844 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1219 07:07:21.263857 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1219 07:07:21.263873 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1219 07:07:21.263884 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1219 07:07:21.263902 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1219 07:07:21.263917 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1219 07:07:21.263932 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1219 07:07:21.263987 2259035 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem (1338 bytes)
	W1219 07:07:21.264028 2259035 certs.go:480] ignoring /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386_empty.pem, impossibly tiny 0 bytes
	I1219 07:07:21.264041 2259035 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca-key.pem (1679 bytes)
	I1219 07:07:21.264069 2259035 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/ca.pem (1078 bytes)
	I1219 07:07:21.264099 2259035 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/cert.pem (1123 bytes)
	I1219 07:07:21.264127 2259035 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/key.pem (1671 bytes)
	I1219 07:07:21.264179 2259035 certs.go:484] found cert: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem (1708 bytes)
	I1219 07:07:21.264213 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem -> /usr/share/ca-certificates/20003862.pem
	I1219 07:07:21.264230 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1219 07:07:21.264242 2259035 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem -> /usr/share/ca-certificates/2000386.pem
	I1219 07:07:21.264809 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1219 07:07:21.282881 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1219 07:07:21.301333 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1219 07:07:21.320382 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I1219 07:07:21.338201 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I1219 07:07:21.355885 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1219 07:07:21.373416 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1219 07:07:21.391038 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/force-systemd-env-631273/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1219 07:07:21.412675 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/ssl/certs/20003862.pem --> /usr/share/ca-certificates/20003862.pem (1708 bytes)
	I1219 07:07:21.433593 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1219 07:07:21.453885 2259035 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22230-1998525/.minikube/certs/2000386.pem --> /usr/share/ca-certificates/2000386.pem (1338 bytes)
	I1219 07:07:21.473468 2259035 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (722 bytes)
	I1219 07:07:21.486985 2259035 ssh_runner.go:195] Run: openssl version
	I1219 07:07:21.493478 2259035 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/20003862.pem
	I1219 07:07:21.501615 2259035 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/20003862.pem /etc/ssl/certs/20003862.pem
	I1219 07:07:21.509690 2259035 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/20003862.pem
	I1219 07:07:21.513789 2259035 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec 19 05:57 /usr/share/ca-certificates/20003862.pem
	I1219 07:07:21.513903 2259035 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/20003862.pem
	I1219 07:07:21.555117 2259035 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/3ec20f2e.0
	I1219 07:07:21.562972 2259035 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/20003862.pem /etc/ssl/certs/3ec20f2e.0
	I1219 07:07:21.570677 2259035 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/minikubeCA.pem
	I1219 07:07:21.578185 2259035 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem
	I1219 07:07:21.585877 2259035 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1219 07:07:21.589827 2259035 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec 19 05:43 /usr/share/ca-certificates/minikubeCA.pem
	I1219 07:07:21.589931 2259035 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1219 07:07:21.630845 2259035 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/b5213941.0
	I1219 07:07:21.638386 2259035 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0
	I1219 07:07:21.645871 2259035 ssh_runner.go:195] Run: sudo test -s /usr/share/ca-certificates/2000386.pem
	I1219 07:07:21.653277 2259035 ssh_runner.go:195] Run: sudo ln -fs /usr/share/ca-certificates/2000386.pem /etc/ssl/certs/2000386.pem
	I1219 07:07:21.660696 2259035 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/2000386.pem
	I1219 07:07:21.664448 2259035 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec 19 05:57 /usr/share/ca-certificates/2000386.pem
	I1219 07:07:21.664538 2259035 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/2000386.pem
	I1219 07:07:21.705584 2259035 ssh_runner.go:195] Run: sudo test -L /etc/ssl/certs/51391683.0
	I1219 07:07:21.713158 2259035 ssh_runner.go:195] Run: sudo ln -fs /etc/ssl/certs/2000386.pem /etc/ssl/certs/51391683.0
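The repeated `openssl x509 -hash` / `ln -fs` pairs above implement the standard OpenSSL hashed-directory layout: each CA file is linked into `/etc/ssl/certs` under a name of the form `<subject-hash>.0` (e.g. `b5213941.0` for `minikubeCA.pem`). A minimal sketch of the same linking against a throwaway self-signed cert in a temp directory (the `/CN=minikubeCA` subject and all paths here are illustrative, not taken from the cluster under test):

```shell
# Create a scratch dir and a throwaway self-signed CA certificate.
tmp=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=minikubeCA" \
  -keyout "$tmp/ca.key" -out "$tmp/minikubeCA.pem" 2>/dev/null

# Compute the OpenSSL subject hash -- the 8-hex-digit value the
# symlink name is keyed on.
hash=$(openssl x509 -hash -noout -in "$tmp/minikubeCA.pem")

# Link the cert under "<hash>.0", as the `sudo ln -fs` calls in the
# log do; a real run targets /etc/ssl/certs instead of $tmp.
ln -fs "$tmp/minikubeCA.pem" "$tmp/$hash.0"
ls -l "$tmp/$hash.0"
```

Because the symlink name is derived from the certificate subject, re-running the linking is idempotent, which is why the log can safely `ln -fs` without first checking whether the link exists.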
	I1219 07:07:21.720497 2259035 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1219 07:07:21.724276 2259035 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1219 07:07:21.724332 2259035 kubeadm.go:401] StartCluster: {Name:force-systemd-env-631273 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:force-systemd-env-631273 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 07:07:21.724413 2259035 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1219 07:07:21.724477 2259035 ssh_runner.go:195] Run: sudo -s eval "crictl --timeout=10s ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1219 07:07:21.753881 2259035 cri.go:92] found id: ""
	I1219 07:07:21.754019 2259035 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1219 07:07:21.761704 2259035 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1219 07:07:21.769576 2259035 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1219 07:07:21.769685 2259035 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1219 07:07:21.777344 2259035 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1219 07:07:21.777364 2259035 kubeadm.go:158] found existing configuration files:
	
	I1219 07:07:21.777417 2259035 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1219 07:07:21.784867 2259035 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1219 07:07:21.784961 2259035 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1219 07:07:21.792383 2259035 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1219 07:07:21.800086 2259035 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1219 07:07:21.800190 2259035 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1219 07:07:21.807499 2259035 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1219 07:07:21.815193 2259035 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1219 07:07:21.815314 2259035 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1219 07:07:21.822934 2259035 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1219 07:07:21.830719 2259035 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1219 07:07:21.830787 2259035 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
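The four grep-then-rm pairs above are one pattern applied per kubeconfig file: keep the file only if it references the expected control-plane endpoint, otherwise treat it as stale and remove it. A self-contained sketch of that loop (file names and the endpoint are taken from the log; it runs against a temp directory standing in for `/etc/kubernetes`, whereas minikube runs the equivalent over ssh with sudo):

```shell
endpoint="https://control-plane.minikube.internal:8443"
confdir=$(mktemp -d)   # stand-in for /etc/kubernetes

# Seed one file that points at the right host and one stale file,
# purely to demonstrate both branches.
echo "server: $endpoint" > "$confdir/admin.conf"
echo "server: https://other-host:6443" > "$confdir/kubelet.conf"

for f in admin.conf kubelet.conf controller-manager.conf scheduler.conf; do
  # Mirror of the log's per-file sequence: grep for the endpoint,
  # and rm -f the file when the grep fails (including when the file
  # is simply absent, as in this first-start run).
  grep -q "$endpoint" "$confdir/$f" 2>/dev/null || rm -f "$confdir/$f"
done
ls "$confdir"
```

Note that `rm -f` makes the missing-file case (status 2 from grep, as seen throughout this section) indistinguishable from the stale-file case, which is why the log proceeds without error on a node that has no configs at all.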
	I1219 07:07:21.838471 2259035 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.3:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1219 07:07:21.909195 2259035 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is in maintenance mode, please migrate to cgroups v2
	I1219 07:07:21.909501 2259035 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 07:07:21.983672 2259035 kubeadm.go:319] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 07:07:28.602406 2212400 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000904725s
	I1219 07:07:28.602457 2212400 kubeadm.go:319] 
	I1219 07:07:28.602516 2212400 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1219 07:07:28.602552 2212400 kubeadm.go:319] 	- The kubelet is not running
	I1219 07:07:28.602665 2212400 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1219 07:07:28.602672 2212400 kubeadm.go:319] 
	I1219 07:07:28.602776 2212400 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1219 07:07:28.602807 2212400 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1219 07:07:28.602838 2212400 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1219 07:07:28.602842 2212400 kubeadm.go:319] 
	I1219 07:07:28.606599 2212400 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1219 07:07:28.607025 2212400 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1219 07:07:28.607132 2212400 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1219 07:07:28.607366 2212400 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1219 07:07:28.607373 2212400 kubeadm.go:319] 
	I1219 07:07:28.607441 2212400 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1219 07:07:28.607493 2212400 kubeadm.go:403] duration metric: took 12m18.671397417s to StartCluster
	I1219 07:07:28.607527 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1219 07:07:28.607587 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-apiserver
	I1219 07:07:28.634525 2212400 cri.go:92] found id: ""
	I1219 07:07:28.634558 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.634568 2212400 logs.go:284] No container was found matching "kube-apiserver"
	I1219 07:07:28.634574 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1219 07:07:28.634633 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=etcd
	I1219 07:07:28.674733 2212400 cri.go:92] found id: ""
	I1219 07:07:28.674756 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.674765 2212400 logs.go:284] No container was found matching "etcd"
	I1219 07:07:28.674771 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1219 07:07:28.674836 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=coredns
	I1219 07:07:28.722927 2212400 cri.go:92] found id: ""
	I1219 07:07:28.722948 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.722957 2212400 logs.go:284] No container was found matching "coredns"
	I1219 07:07:28.722963 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1219 07:07:28.723032 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-scheduler
	I1219 07:07:28.767653 2212400 cri.go:92] found id: ""
	I1219 07:07:28.767675 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.767684 2212400 logs.go:284] No container was found matching "kube-scheduler"
	I1219 07:07:28.767691 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1219 07:07:28.767752 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-proxy
	I1219 07:07:28.797822 2212400 cri.go:92] found id: ""
	I1219 07:07:28.797844 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.797852 2212400 logs.go:284] No container was found matching "kube-proxy"
	I1219 07:07:28.797864 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1219 07:07:28.797922 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kube-controller-manager
	I1219 07:07:28.826985 2212400 cri.go:92] found id: ""
	I1219 07:07:28.827007 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.827016 2212400 logs.go:284] No container was found matching "kube-controller-manager"
	I1219 07:07:28.827023 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1219 07:07:28.827082 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=kindnet
	I1219 07:07:28.863020 2212400 cri.go:92] found id: ""
	I1219 07:07:28.863096 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.863128 2212400 logs.go:284] No container was found matching "kindnet"
	I1219 07:07:28.863148 2212400 cri.go:57] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1219 07:07:28.863238 2212400 ssh_runner.go:195] Run: sudo crictl --timeout=10s ps -a --quiet --name=storage-provisioner
	I1219 07:07:28.891494 2212400 cri.go:92] found id: ""
	I1219 07:07:28.891516 2212400 logs.go:282] 0 containers: []
	W1219 07:07:28.891524 2212400 logs.go:284] No container was found matching "storage-provisioner"
	I1219 07:07:28.891536 2212400 logs.go:123] Gathering logs for dmesg ...
	I1219 07:07:28.891548 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1219 07:07:28.937806 2212400 logs.go:123] Gathering logs for describe nodes ...
	I1219 07:07:28.937892 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1219 07:07:29.048357 2212400 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1219 07:07:29.048428 2212400 logs.go:123] Gathering logs for containerd ...
	I1219 07:07:29.048443 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1219 07:07:29.099207 2212400 logs.go:123] Gathering logs for container status ...
	I1219 07:07:29.099289 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1219 07:07:29.143691 2212400 logs.go:123] Gathering logs for kubelet ...
	I1219 07:07:29.143715 2212400 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1219 07:07:29.208467 2212400 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000904725s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1219 07:07:29.208575 2212400 out.go:285] * 
	W1219 07:07:29.208658 2212400 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000904725s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1219 07:07:29.208712 2212400 out.go:285] * 
	W1219 07:07:29.210920 2212400 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1219 07:07:29.216145 2212400 out.go:203] 
	W1219 07:07:29.219984 2212400 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-rc.1:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-rc.1
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000904725s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1219 07:07:29.220090 2212400 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1219 07:07:29.220149 2212400 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1219 07:07:29.223701 2212400 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 19 06:59:22 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:22.929734285Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:59:22 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:22.931131714Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.13.1\" with image id \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"21168808\" in 1.355502189s"
	Dec 19 06:59:22 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:22.931176498Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\""
	Dec 19 06:59:22 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:22.932552643Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
	Dec 19 06:59:23 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:23.568078342Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
	Dec 19 06:59:23 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:23.569914036Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268709"
	Dec 19 06:59:23 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:23.572350344Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
	Dec 19 06:59:23 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:23.576178280Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
	Dec 19 06:59:23 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:23.577155431Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 644.523459ms"
	Dec 19 06:59:23 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:23.577285705Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\""
	Dec 19 06:59:23 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:23.578289301Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\""
	Dec 19 06:59:25 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:25.175289734Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:59:25 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:25.177138549Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=21753021"
	Dec 19 06:59:25 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:25.179687252Z" level=info msg="ImageCreate event name:\"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:59:25 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:25.183740003Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Dec 19 06:59:25 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:25.184880740Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest \"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size \"21749640\" in 1.606554426s"
	Dec 19 06:59:25 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T06:59:25.185014920Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\""
	Dec 19 07:04:15 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T07:04:15.445267313Z" level=info msg="container event discarded" container=77f94e0a02c8e67ea94fdba8911ff2ec2c0659102a552218cb63ed0d429d26ea type=CONTAINER_DELETED_EVENT
	Dec 19 07:04:15 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T07:04:15.460547116Z" level=info msg="container event discarded" container=e2942d2ff7e0ce5a52f9ec3f9a33c7428d332418d0451dee8739a40a8e66e0a1 type=CONTAINER_DELETED_EVENT
	Dec 19 07:04:15 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T07:04:15.470959472Z" level=info msg="container event discarded" container=a120dd410509cbe970f6e58cccbf17d9454a4de237df53a5883541d631b2b207 type=CONTAINER_DELETED_EVENT
	Dec 19 07:04:15 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T07:04:15.471012150Z" level=info msg="container event discarded" container=fe72ce9fb88be2159011721fac7e7deb0525a086340fc9d3b2325d65971a04c0 type=CONTAINER_DELETED_EVENT
	Dec 19 07:04:15 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T07:04:15.487381679Z" level=info msg="container event discarded" container=d7cb0d92b700cc8d05c1496ee986ae79bb301ad2b114a08f942bcc577d6337d5 type=CONTAINER_DELETED_EVENT
	Dec 19 07:04:15 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T07:04:15.487435907Z" level=info msg="container event discarded" container=5282fa1377d407ead3e37424d6307a3076ea4111a48da258bf7da28c8dc39805 type=CONTAINER_DELETED_EVENT
	Dec 19 07:04:15 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T07:04:15.503835484Z" level=info msg="container event discarded" container=178dfe3a74c6291056be4141128ad2b1b44040d9943f933d57d177965212d6fb type=CONTAINER_DELETED_EVENT
	Dec 19 07:04:15 kubernetes-upgrade-352421 containerd[557]: time="2025-12-19T07:04:15.503874048Z" level=info msg="container event discarded" container=be0ae7c2dcd61b8d7bae609b3bca7be4e26072e17a0ee4c365d967441948a42f type=CONTAINER_DELETED_EVENT
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-rc.1/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec19 04:47] overlayfs: idmapped layers are currently not supported
	[Dec19 04:48] overlayfs: idmapped layers are currently not supported
	[Dec19 04:49] overlayfs: idmapped layers are currently not supported
	[Dec19 04:51] overlayfs: idmapped layers are currently not supported
	[Dec19 04:53] overlayfs: idmapped layers are currently not supported
	[Dec19 05:03] overlayfs: idmapped layers are currently not supported
	[Dec19 05:04] overlayfs: idmapped layers are currently not supported
	[Dec19 05:05] overlayfs: idmapped layers are currently not supported
	[Dec19 05:06] overlayfs: idmapped layers are currently not supported
	[ +12.793339] overlayfs: idmapped layers are currently not supported
	[Dec19 05:07] overlayfs: idmapped layers are currently not supported
	[Dec19 05:08] overlayfs: idmapped layers are currently not supported
	[Dec19 05:09] overlayfs: idmapped layers are currently not supported
	[Dec19 05:10] overlayfs: idmapped layers are currently not supported
	[Dec19 05:11] overlayfs: idmapped layers are currently not supported
	[Dec19 05:13] overlayfs: idmapped layers are currently not supported
	[Dec19 05:14] overlayfs: idmapped layers are currently not supported
	[Dec19 05:32] overlayfs: idmapped layers are currently not supported
	[Dec19 05:33] overlayfs: idmapped layers are currently not supported
	[Dec19 05:35] overlayfs: idmapped layers are currently not supported
	[Dec19 05:36] overlayfs: idmapped layers are currently not supported
	[Dec19 05:38] overlayfs: idmapped layers are currently not supported
	[Dec19 05:39] overlayfs: idmapped layers are currently not supported
	[Dec19 05:40] overlayfs: idmapped layers are currently not supported
	[Dec19 05:42] kauditd_printk_skb: 8 callbacks suppressed
	
	
	==> kernel <==
	 07:07:31 up 11:49,  0 user,  load average: 1.88, 1.43, 1.71
	Linux kubernetes-upgrade-352421 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 19 07:07:28 kubernetes-upgrade-352421 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 07:07:28 kubernetes-upgrade-352421 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Dec 19 07:07:28 kubernetes-upgrade-352421 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 07:07:28 kubernetes-upgrade-352421 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 07:07:28 kubernetes-upgrade-352421 kubelet[14555]: E1219 07:07:28.991561   14555 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 07:07:29 kubernetes-upgrade-352421 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 07:07:29 kubernetes-upgrade-352421 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 07:07:29 kubernetes-upgrade-352421 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Dec 19 07:07:29 kubernetes-upgrade-352421 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 07:07:29 kubernetes-upgrade-352421 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 07:07:29 kubernetes-upgrade-352421 kubelet[14581]: E1219 07:07:29.762254   14581 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 07:07:29 kubernetes-upgrade-352421 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 07:07:29 kubernetes-upgrade-352421 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 07:07:30 kubernetes-upgrade-352421 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Dec 19 07:07:30 kubernetes-upgrade-352421 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 07:07:30 kubernetes-upgrade-352421 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 07:07:30 kubernetes-upgrade-352421 kubelet[14586]: E1219 07:07:30.527577   14586 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 07:07:30 kubernetes-upgrade-352421 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 07:07:30 kubernetes-upgrade-352421 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 19 07:07:31 kubernetes-upgrade-352421 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 323.
	Dec 19 07:07:31 kubernetes-upgrade-352421 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 07:07:31 kubernetes-upgrade-352421 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 19 07:07:31 kubernetes-upgrade-352421 kubelet[14608]: E1219 07:07:31.229276   14608 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 19 07:07:31 kubernetes-upgrade-352421 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 19 07:07:31 kubernetes-upgrade-352421 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:263: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p kubernetes-upgrade-352421 -n kubernetes-upgrade-352421
helpers_test.go:263: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p kubernetes-upgrade-352421 -n kubernetes-upgrade-352421: exit status 2 (421.338499ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:263: status error: exit status 2 (may be ok)
helpers_test.go:265: "kubernetes-upgrade-352421" apiserver is not running, skipping kubectl commands (state="Stopped")
helpers_test.go:176: Cleaning up "kubernetes-upgrade-352421" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p kubernetes-upgrade-352421
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p kubernetes-upgrade-352421: (2.488356759s)
--- FAIL: TestKubernetesUpgrade (800.57s)
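The kubelet crash loop above ("kubelet is configured to not run on a host using cgroup v1") is the cgroup v1 deprecation gate introduced for kubelet v1.35, as the kubeadm SystemVerification warning in the same log explains. A minimal sketch for confirming which cgroup hierarchy a test host exposes (standard Linux paths; the `FailCgroupV1` remediation named in the warning belongs in the kubelet configuration file and is not shown here):

```shell
# Quick check of which cgroup hierarchy the host exposes -- the condition
# the kubelet v1.35 validation in the log above trips on.
# The "|| echo unknown" fallback covers hosts without /sys/fs/cgroup.
fstype=$(stat -fc %T /sys/fs/cgroup/ 2>/dev/null || echo unknown)
echo "cgroup filesystem: $fstype"
# "cgroup2fs" indicates cgroup v2; "tmpfs" indicates the legacy v1 layout
# that a v1.35 kubelet rejects unless FailCgroupV1 is explicitly disabled.
```

On a host reporting `tmpfs`, the minikube suggestion logged above (`--extra-config=kubelet.cgroup-driver=systemd`) will not by itself clear this validation; the host either needs cgroup v2 or the explicit v1 opt-out described in the kubeadm warning.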

TestStartStop/group/default-k8s-diff-port/serial/SecondStart (7200.08s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/SecondStart
start_stop_delete_test.go:254: (dbg) Run:  out/minikube-linux-arm64 start -p default-k8s-diff-port-042036 --memory=3072 --alsologtostderr --wait=true --apiserver-port=8444 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.34.3
panic: test timed out after 2h0m0s
	running tests:
		TestNetworkPlugins (36m8s)
		TestStartStop (37m46s)
		TestStartStop/group/default-k8s-diff-port (1m22s)
		TestStartStop/group/default-k8s-diff-port/serial (1m22s)
		TestStartStop/group/default-k8s-diff-port/serial/SecondStart (3s)
		TestStartStop/group/embed-certs (14m18s)
		TestStartStop/group/embed-certs/serial (14m18s)
		TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (3m7s)

goroutine 7618 [running]:
testing.(*M).startAlarm.func1()
	/usr/local/go/src/testing/testing.go:2682 +0x2b0
created by time.goFunc
	/usr/local/go/src/time/sleep.go:215 +0x38

goroutine 1 [chan receive, 31 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x400039e700, 0x40008fbbb8)
	/usr/local/go/src/testing/testing.go:1940 +0x104
testing.runTests(0x400076a180, {0x540d780, 0x2c, 0x2c}, {0x40008fbd08?, 0x125774?, 0x54366c0?})
	/usr/local/go/src/testing/testing.go:2475 +0x3b8
testing.(*M).Run(0x40002f3b80)
	/usr/local/go/src/testing/testing.go:2337 +0x530
k8s.io/minikube/test/integration.TestMain(0x40002f3b80)
	/home/jenkins/workspace/Build_Cross/test/integration/main_test.go:64 +0xf0
main.main()
	_testmain.go:133 +0x88

goroutine 5193 [chan receive, 3 minutes]:
testing.(*T).Run(0x40006f3dc0, {0x2a0ec1c?, 0x40000006ee?}, 0x40016aa080)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1.1(0x40006f3dc0)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:153 +0x1b8
testing.tRunner(0x40006f3dc0, 0x40015e0700)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3504
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 155 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x3769e10, 0x40001061c0}, 0x40014bff40, 0x40014bff88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x3769e10, 0x40001061c0}, 0x84?, 0x40014bff40, 0x40014bff88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x3769e10?, 0x40001061c0?}, 0x4000103a40?, 0x4000096608?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4004ee0540?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 164
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:146 +0x20c

goroutine 3582 [chan receive, 36 minutes]:
testing.(*testState).waitParallel(0x40005965f0)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x4001c93880)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x4001c93880)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:501 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4001c93880)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x4001c93880, 0x400040e700)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3581
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3834 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x37831e0, {{0x37776d0, 0x4000222040?}, 0x400022a780?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3830
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/delaying_queue.go:157 +0x204

goroutine 164 [chan receive, 117 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4004f06720, 0x40001061c0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 153
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cache.go:126 +0x4d0

goroutine 1354 [IO wait, 105 minutes]:
internal/poll.runtime_pollWait(0xffff4c0d4600, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x400040ea80?, 0x2d970?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x400040ea80)
	/usr/local/go/src/internal/poll/fd_unix.go:613 +0x21c
net.(*netFD).accept(0x400040ea80)
	/usr/local/go/src/net/fd_unix.go:161 +0x28
net.(*TCPListener).accept(0x4001ad4480)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x24
net.(*TCPListener).Accept(0x4001ad4480)
	/usr/local/go/src/net/tcpsock.go:380 +0x2c
net/http.(*Server).Serve(0x40006f6100, {0x3756d20, 0x4001ad4480})
	/usr/local/go/src/net/http/server.go:3463 +0x24c
net/http.(*Server).ListenAndServe(0x40006f6100)
	/usr/local/go/src/net/http/server.go:3389 +0x80
k8s.io/minikube/test/integration.startHTTPProxy.func1(...)
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2218
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 1352
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2217 +0x104

goroutine 163 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x37831e0, {{0x37776d0, 0x4000222040?}, 0x400072af00?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 153
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/delaying_queue.go:157 +0x204

goroutine 154 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x4004e8a290, 0x2d)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4004e8a280)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3786dc0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/queue.go:260 +0x84
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4004f06720)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x400017a690?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x3769e10?, 0x40001061c0?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x3769e10, 0x40001061c0}, 0x40013e1f38, {0x3721cc0, 0x40006fa6f0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x37776d0?, {0x3721cc0?, 0x40006fa6f0?}, 0x20?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40006efa60, 0x3b9aca00, 0x0, 0x1, 0x40001061c0)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 164
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:144 +0x174

goroutine 156 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 155
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:280 +0xb8

goroutine 7536 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0x400077a1d0, 0x0)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x400077a1c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3786dc0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/queue.go:260 +0x84
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x400184e300)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001a74540?, 0x21dd4?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x3769e10?, 0x40001061c0?}, 0x40012ce6a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x3769e10, 0x40001061c0}, 0x400138af38, {0x3721cc0, 0x40015b91d0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x11?, {0x3721cc0?, 0x40015b91d0?}, 0x7c?, 0x400072bb00?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001771ad0, 0x3b9aca00, 0x0, 0x1, 0x40001061c0)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 7546
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:144 +0x174

goroutine 908 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x3769e10, 0x40001061c0}, 0x40012d2740, 0x400010cf88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x3769e10, 0x40001061c0}, 0x18?, 0x40012d2740, 0x40012d2788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x3769e10?, 0x40001061c0?}, 0x0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x400072bb00?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 916
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:146 +0x20c

goroutine 7610 [syscall]:
syscall.Syscall6(0x5f, 0x3, 0x12, 0x400142eb18, 0x4, 0x4000088900, 0x0)
	/usr/local/go/src/syscall/syscall_linux.go:96 +0x2c
internal/syscall/unix.Waitid(0x400142ec78?, 0x1929a0?, 0xffffd8520145?, 0x0?, 0x40014dd140?)
	/usr/local/go/src/internal/syscall/unix/waitid_linux.go:18 +0x44
os.(*Process).pidfdWait.func1(...)
	/usr/local/go/src/os/pidfd_linux.go:109
os.ignoringEINTR(...)
	/usr/local/go/src/os/file_posix.go:256
os.(*Process).pidfdWait(0x40007080c0)
	/usr/local/go/src/os/pidfd_linux.go:108 +0x144
os.(*Process).wait(0x400142ec48?)
	/usr/local/go/src/os/exec_unix.go:25 +0x24
os.(*Process).Wait(...)
	/usr/local/go/src/os/exec.go:340
os/exec.(*Cmd).Wait(0x400154d080)
	/usr/local/go/src/os/exec/exec.go:922 +0x38
os/exec.(*Cmd).Run(0x400154d080)
	/usr/local/go/src/os/exec/exec.go:626 +0x38
k8s.io/minikube/test/integration.Run(0x400152f6c0, 0x400154d080)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:104 +0x154
k8s.io/minikube/test/integration.validateSecondStart({0x3769a88, 0x40003482a0}, 0x400152f6c0, {0x400147b140, 0x1c}, {0x1dc02d94?, 0x1dc02d9400161e84?}, {0x69450205?, 0x400142ef58?}, {0x40006f6600?, ...})
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:254 +0x90
k8s.io/minikube/test/integration.TestStartStop.func1.1.1.1(0x400152f6c0?)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:154 +0x44
testing.tRunner(0x400152f6c0, 0x40015e1800)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 7482
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 915 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x37831e0, {{0x37776d0, 0x4000222040?}, 0x40013a8e00?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 914
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/delaying_queue.go:157 +0x204

goroutine 1059 [chan send, 107 minutes]:
os/exec.(*Cmd).watchCtx(0x400180bb00, 0x400182f500)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1058
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 5044 [sync.Cond.Wait, 5 minutes]:
sync.runtime_notifyListWait(0x4001602650, 0x2)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001602640)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3786dc0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/queue.go:260 +0x84
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x400177ea20)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x400035f260?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x3769e10?, 0x40001061c0?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x3769e10, 0x40001061c0}, 0x4000109f38, {0x3721cc0, 0x40017fc8d0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x37776d0?, {0x3721cc0?, 0x40017fc8d0?}, 0x70?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40018913e0, 0x3b9aca00, 0x0, 0x1, 0x40001061c0)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5041
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:144 +0x174

goroutine 3498 [chan receive, 14 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x40017eaa80, 0x34197e8)
	/usr/local/go/src/testing/testing.go:1940 +0x104
created by testing.(*T).Run in goroutine 3335
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3835 [chan receive, 35 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40014a59e0, 0x40001061c0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3830
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cache.go:126 +0x4d0

goroutine 7554 [select]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 7553
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:280 +0xb8

goroutine 907 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x40014d99d0, 0x2b)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40014d99c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3786dc0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/queue.go:260 +0x84
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x400167a360)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4000314850?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x3769e10?, 0x40001061c0?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x3769e10, 0x40001061c0}, 0x40014c0f38, {0x3721cc0, 0x40015c8840}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x37776d0?, {0x3721cc0?, 0x40015c8840?}, 0x20?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40016c5b30, 0x3b9aca00, 0x0, 0x1, 0x40001061c0)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 916
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:144 +0x174

goroutine 1158 [chan send, 107 minutes]:
os/exec.(*Cmd).watchCtx(0x4001aa5e00, 0x4001aa7730)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 842
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 3817 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3816
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:280 +0xb8

goroutine 3246 [chan receive, 36 minutes]:
testing.(*T).Run(0x40006f2540, {0x29e7e9a?, 0x25b5890ac5ab?}, 0x40017e8db0)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestNetworkPlugins(0x40006f2540)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:52 +0xe4
testing.tRunner(0x40006f2540, 0x34195b8)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 1964 [chan send, 76 minutes]:
os/exec.(*Cmd).watchCtx(0x400072b800, 0x40016651f0)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1962
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 3581 [chan receive, 36 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x4001c936c0, 0x40017e8db0)
	/usr/local/go/src/testing/testing.go:1940 +0x104
created by testing.(*T).Run in goroutine 3246
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 5503 [IO wait]:
internal/poll.runtime_pollWait(0xffff4c0d4400, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x400156f100?, 0x40002a6400?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x400156f100, {0x40002a6400, 0xc00, 0xc00})
	/usr/local/go/src/internal/poll/fd_unix.go:165 +0x1e0
net.(*netFD).Read(0x400156f100, {0x40002a6400?, 0x4001388848?, 0x2a138?})
	/usr/local/go/src/net/fd_posix.go:68 +0x28
net.(*conn).Read(0x4000112b28, {0x40002a6400?, 0x3002a6400?, 0xffff4bebf400?})
	/usr/local/go/src/net/net.go:196 +0x34
crypto/tls.(*atLeastReader).Read(0x40014d4330, {0x40002a6400?, 0x4001388908?, 0x2cbb64?})
	/usr/local/go/src/crypto/tls/conn.go:816 +0x38
bytes.(*Buffer).ReadFrom(0x40013342a8, {0x3722400, 0x40014d4330})
	/usr/local/go/src/bytes/buffer.go:217 +0x90
crypto/tls.(*Conn).readFromUntil(0x4001334008, {0x3721840, 0x4000112b28}, 0x40013889b0?)
	/usr/local/go/src/crypto/tls/conn.go:838 +0xcc
crypto/tls.(*Conn).readRecordOrCCS(0x4001334008, 0x0)
	/usr/local/go/src/crypto/tls/conn.go:627 +0x340
crypto/tls.(*Conn).readRecord(...)
	/usr/local/go/src/crypto/tls/conn.go:589
crypto/tls.(*Conn).Read(0x4001334008, {0x40012e2000, 0x1000, 0x4000000000?})
	/usr/local/go/src/crypto/tls/conn.go:1392 +0x14c
bufio.(*Reader).Read(0x4004f07f20, {0x40001b0f24, 0x9, 0x52b380?})
	/usr/local/go/src/bufio/bufio.go:245 +0x188
io.ReadAtLeast({0x3720320, 0x4004f07f20}, {0x40001b0f24, 0x9, 0x9}, 0x9)
	/usr/local/go/src/io/io.go:335 +0x98
io.ReadFull(...)
	/usr/local/go/src/io/io.go:354
golang.org/x/net/http2.readFrameHeader({0x40001b0f24, 0x9, 0x40000005ad?}, {0x3720320?, 0x4004f07f20?})
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.47.0/http2/frame.go:242 +0x58
golang.org/x/net/http2.(*Framer).ReadFrameHeader(0x40001b0ee0)
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.47.0/http2/frame.go:505 +0x60
golang.org/x/net/http2.(*Framer).ReadFrame(0x40001b0ee0)
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.47.0/http2/frame.go:564 +0x20
golang.org/x/net/http2.(*clientConnReadLoop).run(0x4001388f98)
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.47.0/http2/transport.go:2208 +0xb8
golang.org/x/net/http2.(*ClientConn).readLoop(0x400152efc0)
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.47.0/http2/transport.go:2077 +0x4c
created by golang.org/x/net/http2.(*Transport).newClientConn in goroutine 5502
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.47.0/http2/transport.go:866 +0xa90

goroutine 3649 [chan receive, 36 minutes]:
testing.(*testState).waitParallel(0x40005965f0)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x4001c93c00)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x4001c93c00)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:501 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4001c93c00)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x4001c93c00, 0x400040ed00)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3581
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 7553 [select]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x3769e10, 0x40001061c0}, 0x40012e0740, 0x40012e0788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x3769e10, 0x40001061c0}, 0xb?, 0x40012e0740, 0x40012e0788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x3769e10?, 0x40001061c0?}, 0x0?, 0x40012e0750?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x37776d0?, 0x4000222040?, 0x400022ad80?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 7546
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:146 +0x20c

goroutine 718 [IO wait, 113 minutes]:
internal/poll.runtime_pollWait(0xffff4c0d4800, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x400040ed80?, 0x2d970?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x400040ed80)
	/usr/local/go/src/internal/poll/fd_unix.go:613 +0x21c
net.(*netFD).accept(0x400040ed80)
	/usr/local/go/src/net/fd_unix.go:161 +0x28
net.(*TCPListener).accept(0x40014d84c0)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x24
net.(*TCPListener).Accept(0x40014d84c0)
	/usr/local/go/src/net/tcpsock.go:380 +0x2c
net/http.(*Server).Serve(0x40006f6300, {0x3756d20, 0x40014d84c0})
	/usr/local/go/src/net/http/server.go:3463 +0x24c
net/http.(*Server).ListenAndServe(0x40006f6300)
	/usr/local/go/src/net/http/server.go:3389 +0x80
k8s.io/minikube/test/integration.startHTTPProxy.func1(...)
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2218
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 716
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2217 +0x104

goroutine 1953 [chan send, 76 minutes]:
os/exec.(*Cmd).watchCtx(0x400072af00, 0x4001664930)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1920
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 7612 [IO wait]:
internal/poll.runtime_pollWait(0xffff4be46e00, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4001bc7ce0?, 0x400156a159?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x4001bc7ce0, {0x400156a159, 0x3ea7, 0x3ea7})
	/usr/local/go/src/internal/poll/fd_unix.go:165 +0x1e0
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0x40001136e8, {0x400156a159?, 0x40012bd568?, 0x8b27c?})
	/usr/local/go/src/os/file.go:144 +0x68
bytes.(*Buffer).ReadFrom(0x40018aa060, {0x3720088, 0x40000a8b10})
	/usr/local/go/src/bytes/buffer.go:217 +0x90
io.copyBuffer({0x3720280, 0x40018aa060}, {0x3720088, 0x40000a8b10}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x14c
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0x40001136e8?, {0x3720280, 0x40018aa060})
	/usr/local/go/src/os/file.go:295 +0x58
os.(*File).WriteTo(0x40001136e8, {0x3720280, 0x40018aa060})
	/usr/local/go/src/os/file.go:273 +0x9c
io.copyBuffer({0x3720280, 0x40018aa060}, {0x3720108, 0x40001136e8}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x98
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:596 +0x40
os/exec.(*Cmd).Start.func2(0x40015aad80?)
	/usr/local/go/src/os/exec/exec.go:749 +0x30
created by os/exec.(*Cmd).Start in goroutine 7610
	/usr/local/go/src/os/exec/exec.go:748 +0x6a4

goroutine 3335 [chan receive, 38 minutes]:
testing.(*T).Run(0x40013a8c40, {0x29e7e9a?, 0x40013e5f58?}, 0x34197e8)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop(0x40013a8c40)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:46 +0x3c
testing.tRunner(0x40013a8c40, 0x3419600)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 916 [chan receive, 107 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x400167a360, 0x40001061c0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 914
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cache.go:126 +0x4d0

goroutine 1602 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x3769e10, 0x40001061c0}, 0x400138df40, 0x400138df88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x3769e10, 0x40001061c0}, 0x84?, 0x400138df40, 0x400138df88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x3769e10?, 0x40001061c0?}, 0x0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x400022b680?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 1597
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:146 +0x20c

goroutine 909 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 908
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:280 +0xb8

goroutine 1601 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x40014d8050, 0x23)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40014d8040)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3786dc0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/queue.go:260 +0x84
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x400167a0c0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4000270e00?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x3769e10?, 0x40001061c0?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x3769e10, 0x40001061c0}, 0x400138cf38, {0x3721cc0, 0x40015b8240}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x37776d0?, {0x3721cc0?, 0x40015b8240?}, 0xc0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x400148ff20, 0x3b9aca00, 0x0, 0x1, 0x40001061c0)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 1597
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:144 +0x174

goroutine 3815 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x4001602b90, 0x16)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001602b80)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3786dc0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/queue.go:260 +0x84
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40014a59e0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001399960?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x3769e10?, 0x40001061c0?}, 0x40000a76a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x3769e10, 0x40001061c0}, 0x400010df38, {0x3721cc0, 0x4001314a50}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40000a77a8?, {0x3721cc0?, 0x4001314a50?}, 0xe0?, 0x161f90?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001596b40, 0x3b9aca00, 0x0, 0x1, 0x40001061c0)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3835
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:144 +0x174

goroutine 7146 [select]:
k8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext({0x3769a88, 0x4001755260}, {0x37573e0, 0x4001c79b60}, 0x1, 0x0, 0x40012c9b00)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/loop.go:66 +0x158
k8s.io/apimachinery/pkg/util/wait.PollUntilContextTimeout({0x3769a88?, 0x4000314620?}, 0x3b9aca00, 0x40012c9d28?, 0x1, 0x40012c9b00)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:48 +0x8c
k8s.io/minikube/test/integration.PodWait({0x3769a88, 0x4000314620}, 0x4001b37180, {0x4001a7cab0, 0x12}, {0x2a0ebcc, 0x14}, {0x2a26da4, 0x1c}, 0x7dba821800)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:380 +0x22c
k8s.io/minikube/test/integration.validateAddonAfterStop({0x3769a88, 0x4000314620}, 0x4001b37180, {0x4001a7cab0, 0x12}, {0x29f5046?, 0x2eaa219500161e84?}, {0x6945014d?, 0x4001443f58?}, {0x161f08?, ...})
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:285 +0xd4
k8s.io/minikube/test/integration.TestStartStop.func1.1.1.1(0x4001b37180?)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:154 +0x44
testing.tRunner(0x4001b37180, 0x40016aa080)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 5193
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3651 [chan receive, 36 minutes]:
testing.(*testState).waitParallel(0x40005965f0)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x40013a8e00)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x40013a8e00)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:501 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x40013a8e00)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x40013a8e00, 0x400040ee80)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3581
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3706 [chan receive, 36 minutes]:
testing.(*testState).waitParallel(0x40005965f0)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x4004ee0540)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x4004ee0540)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:501 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4004ee0540)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x4004ee0540, 0x40015e0c80)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3581
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 7545 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x37831e0, {{0x37776d0, 0x4000222040?}, 0x400022ad80?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 7532
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/delaying_queue.go:157 +0x204

goroutine 7613 [select]:
os/exec.(*Cmd).watchCtx(0x400154d080, 0x4004e7dd50)
	/usr/local/go/src/os/exec/exec.go:789 +0x70
created by os/exec.(*Cmd).Start in goroutine 7610
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 1134 [chan send, 107 minutes]:
os/exec.(*Cmd).watchCtx(0x4001a53800, 0x4001a749a0)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1132
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 7546 [chan receive]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x400184e300, 0x40001061c0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 7532
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cache.go:126 +0x4d0

goroutine 5046 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5045
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:280 +0xb8

goroutine 3652 [chan receive, 36 minutes]:
testing.(*testState).waitParallel(0x40005965f0)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x40013a9dc0)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x40013a9dc0)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:501 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x40013a9dc0)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x40013a9dc0, 0x400040ef00)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3581
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3500 [chan receive, 38 minutes]:
testing.(*testState).waitParallel(0x40005965f0)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x40017eb340)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x40017eb340)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:501 +0x5c
k8s.io/minikube/test/integration.TestStartStop.func1.1(0x40017eb340)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:92 +0x34
testing.tRunner(0x40017eb340, 0x4004e8a140)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3498
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 5045 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x3769e10, 0x40001061c0}, 0x40012bd740, 0x400138ef88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x3769e10, 0x40001061c0}, 0xc8?, 0x40012bd740, 0x40012bd788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x3769e10?, 0x40001061c0?}, 0x400022b800?, 0x4000712140?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x400022b800?, 0x95c64?, 0x400149a000?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5041
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:146 +0x20c

goroutine 7482 [chan receive]:
testing.(*T).Run(0x4001b36fc0, {0x29f505c?, 0x40000006ee?}, 0x40015e1800)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1.1(0x4001b36fc0)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:153 +0x1b8
testing.tRunner(0x4001b36fc0, 0x400156e100)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3501
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 7611 [IO wait]:
internal/poll.runtime_pollWait(0xffff4c0d4a00, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4001bc7c20?, 0x4001439a47?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x4001bc7c20, {0x4001439a47, 0x5b9, 0x5b9})
	/usr/local/go/src/internal/poll/fd_unix.go:165 +0x1e0
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0x40001136c0, {0x4001439a47?, 0x40012d5568?, 0x8b27c?})
	/usr/local/go/src/os/file.go:144 +0x68
bytes.(*Buffer).ReadFrom(0x40018aa030, {0x3720088, 0x40000a8b08})
	/usr/local/go/src/bytes/buffer.go:217 +0x90
io.copyBuffer({0x3720280, 0x40018aa030}, {0x3720088, 0x40000a8b08}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x14c
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0x40001136c0?, {0x3720280, 0x40018aa030})
	/usr/local/go/src/os/file.go:295 +0x58
os.(*File).WriteTo(0x40001136c0, {0x3720280, 0x40018aa030})
	/usr/local/go/src/os/file.go:273 +0x9c
io.copyBuffer({0x3720280, 0x40018aa030}, {0x3720108, 0x40001136c0}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x98
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:596 +0x40
os/exec.(*Cmd).Start.func2(0x400152f6c0?)
	/usr/local/go/src/os/exec/exec.go:749 +0x30
created by os/exec.(*Cmd).Start in goroutine 7610
	/usr/local/go/src/os/exec/exec.go:748 +0x6a4

goroutine 2052 [chan send, 76 minutes]:
os/exec.(*Cmd).watchCtx(0x400149a780, 0x4004e7ca80)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1516
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 1603 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 1602
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:280 +0xb8

goroutine 1597 [chan receive, 78 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x400167a0c0, 0x40001061c0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 1595
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cache.go:126 +0x4d0

goroutine 5041 [chan receive, 14 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x400177ea20, 0x40001061c0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5023
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cache.go:126 +0x4d0

goroutine 3816 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x3769e10, 0x40001061c0}, 0x40013ddf40, 0x40013e7f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x3769e10, 0x40001061c0}, 0x38?, 0x40013ddf40, 0x40013ddf88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x3769e10?, 0x40001061c0?}, 0x400013d450?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4001ca4600?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.34.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3835
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/transport/cert_rotation.go:146 +0x20c

goroutine 1596 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x37831e0, {{0x37776d0, 0x4000222040?}, 0x4001c93500?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 1595
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/delaying_queue.go:157 +0x204

goroutine 3705 [chan receive, 36 minutes]:
testing.(*testState).waitParallel(0x40005965f0)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x4004ee0380)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x4004ee0380)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:501 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4004ee0380)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x4004ee0380, 0x40015e0c00)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3581
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3650 [chan receive, 36 minutes]:
testing.(*testState).waitParallel(0x40005965f0)
	/usr/local/go/src/testing/testing.go:2116 +0x158
testing.(*T).Parallel(0x4001c93dc0)
	/usr/local/go/src/testing/testing.go:1709 +0x19c
k8s.io/minikube/test/integration.MaybeParallel(0x4001c93dc0)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:501 +0x5c
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4001c93dc0)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:106 +0x2c0
testing.tRunner(0x4001c93dc0, 0x400040ee00)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3581
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3501 [chan receive]:
testing.(*T).Run(0x40017eb6c0, {0x29e9311?, 0x0?}, 0x400156e100)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1(0x40017eb6c0)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:128 +0x7e4
testing.tRunner(0x40017eb6c0, 0x4004e8a180)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3498
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3504 [chan receive, 14 minutes]:
testing.(*T).Run(0x4001c92a80, {0x29e9311?, 0x0?}, 0x40015e0700)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1(0x4001c92a80)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:128 +0x7e4
testing.tRunner(0x4001c92a80, 0x4004e8a340)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3498
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 5024 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x37831e0, {{0x37776d0, 0x4000222040?}, 0x40006f2380?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5023
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.34.1/util/workqueue/delaying_queue.go:157 +0x204


Test pass (260/321)

Order passed test Duration
3 TestDownloadOnly/v1.28.0/json-events 6.34
4 TestDownloadOnly/v1.28.0/preload-exists 0
8 TestDownloadOnly/v1.28.0/LogsDuration 0.1
9 TestDownloadOnly/v1.28.0/DeleteAll 0.22
10 TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds 0.15
12 TestDownloadOnly/v1.34.3/json-events 4.12
13 TestDownloadOnly/v1.34.3/preload-exists 0
17 TestDownloadOnly/v1.34.3/LogsDuration 0.27
18 TestDownloadOnly/v1.34.3/DeleteAll 0.39
19 TestDownloadOnly/v1.34.3/DeleteAlwaysSucceeds 0.23
21 TestDownloadOnly/v1.35.0-rc.1/json-events 4.2
22 TestDownloadOnly/v1.35.0-rc.1/preload-exists 0
26 TestDownloadOnly/v1.35.0-rc.1/LogsDuration 0.09
27 TestDownloadOnly/v1.35.0-rc.1/DeleteAll 0.22
28 TestDownloadOnly/v1.35.0-rc.1/DeleteAlwaysSucceeds 0.14
30 TestBinaryMirror 0.62
34 TestAddons/PreSetup/EnablingAddonOnNonExistingCluster 0.07
35 TestAddons/PreSetup/DisablingAddonOnNonExistingCluster 0.08
36 TestAddons/Setup 124.59
38 TestAddons/serial/Volcano 41.75
40 TestAddons/serial/GCPAuth/Namespaces 0.16
41 TestAddons/serial/GCPAuth/FakeCredentials 9.82
44 TestAddons/parallel/Registry 16.64
45 TestAddons/parallel/RegistryCreds 0.77
46 TestAddons/parallel/Ingress 18.58
47 TestAddons/parallel/InspektorGadget 11.86
48 TestAddons/parallel/MetricsServer 6.96
50 TestAddons/parallel/CSI 49.33
51 TestAddons/parallel/Headlamp 17.33
52 TestAddons/parallel/CloudSpanner 5.62
53 TestAddons/parallel/LocalPath 51.05
54 TestAddons/parallel/NvidiaDevicePlugin 5.64
55 TestAddons/parallel/Yakd 11.9
57 TestAddons/StoppedEnableDisable 12.4
58 TestCertOptions 41.18
59 TestCertExpiration 227.48
61 TestForceSystemdFlag 33.34
62 TestForceSystemdEnv 42.27
63 TestDockerEnvContainerd 47.72
67 TestErrorSpam/setup 32.89
68 TestErrorSpam/start 0.77
69 TestErrorSpam/status 1.11
70 TestErrorSpam/pause 1.84
71 TestErrorSpam/unpause 1.89
72 TestErrorSpam/stop 1.62
75 TestFunctional/serial/CopySyncFile 0
76 TestFunctional/serial/StartWithProxy 51.77
77 TestFunctional/serial/AuditLog 0
78 TestFunctional/serial/SoftStart 7.39
79 TestFunctional/serial/KubeContext 0.06
80 TestFunctional/serial/KubectlGetPods 0.09
83 TestFunctional/serial/CacheCmd/cache/add_remote 3.54
84 TestFunctional/serial/CacheCmd/cache/add_local 1.24
85 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.05
86 TestFunctional/serial/CacheCmd/cache/list 0.05
87 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.3
88 TestFunctional/serial/CacheCmd/cache/cache_reload 1.92
89 TestFunctional/serial/CacheCmd/cache/delete 0.11
90 TestFunctional/serial/MinikubeKubectlCmd 0.14
91 TestFunctional/serial/MinikubeKubectlCmdDirectly 0.13
92 TestFunctional/serial/ExtraConfig 296.24
93 TestFunctional/serial/ComponentHealth 0.13
94 TestFunctional/serial/LogsCmd 1.63
95 TestFunctional/serial/LogsFileCmd 1.66
96 TestFunctional/serial/InvalidService 4.84
98 TestFunctional/parallel/ConfigCmd 0.47
100 TestFunctional/parallel/DryRun 0.59
101 TestFunctional/parallel/InternationalLanguage 0.5
102 TestFunctional/parallel/StatusCmd 1.33
106 TestFunctional/parallel/ServiceCmdConnect 9.61
107 TestFunctional/parallel/AddonsCmd 0.14
108 TestFunctional/parallel/PersistentVolumeClaim 18.86
110 TestFunctional/parallel/SSHCmd 0.72
111 TestFunctional/parallel/CpCmd 2.51
113 TestFunctional/parallel/FileSync 0.39
114 TestFunctional/parallel/CertSync 2.2
118 TestFunctional/parallel/NodeLabels 0.09
120 TestFunctional/parallel/NonActiveRuntimeDisabled 0.72
122 TestFunctional/parallel/License 0.32
124 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.74
125 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0
127 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 8.54
128 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.08
129 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0
133 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.11
134 TestFunctional/parallel/ServiceCmd/DeployApp 8.23
135 TestFunctional/parallel/ProfileCmd/profile_not_create 0.45
136 TestFunctional/parallel/ProfileCmd/profile_list 0.48
137 TestFunctional/parallel/ProfileCmd/profile_json_output 0.44
138 TestFunctional/parallel/MountCmd/any-port 8.47
139 TestFunctional/parallel/ServiceCmd/List 0.56
140 TestFunctional/parallel/ServiceCmd/JSONOutput 0.52
141 TestFunctional/parallel/ServiceCmd/HTTPS 0.51
142 TestFunctional/parallel/ServiceCmd/Format 0.46
143 TestFunctional/parallel/ServiceCmd/URL 0.5
144 TestFunctional/parallel/MountCmd/specific-port 2.46
145 TestFunctional/parallel/MountCmd/VerifyCleanup 1.59
146 TestFunctional/parallel/Version/short 0.07
147 TestFunctional/parallel/Version/components 1.41
148 TestFunctional/parallel/ImageCommands/ImageListShort 0.26
149 TestFunctional/parallel/ImageCommands/ImageListTable 0.29
150 TestFunctional/parallel/ImageCommands/ImageListJson 0.29
151 TestFunctional/parallel/ImageCommands/ImageListYaml 0.31
152 TestFunctional/parallel/ImageCommands/ImageBuild 4.63
153 TestFunctional/parallel/ImageCommands/Setup 0.64
154 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 1.12
155 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 1.25
156 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 1.85
157 TestFunctional/parallel/ImageCommands/ImageSaveToFile 0.57
158 TestFunctional/parallel/ImageCommands/ImageRemove 0.65
159 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 0.8
160 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 0.54
161 TestFunctional/parallel/UpdateContextCmd/no_changes 0.19
162 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.15
163 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.18
164 TestFunctional/delete_echo-server_images 0.04
165 TestFunctional/delete_my-image_image 0.02
166 TestFunctional/delete_minikube_cached_images 0.02
170 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CopySyncFile 0
172 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/AuditLog 0
174 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubeContext 0.06
178 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_remote 3.35
179 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_local 1.04
180 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/CacheDelete 0.06
181 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/list 0.07
182 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/verify_cache_inside_node 0.32
183 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/cache_reload 1.93
184 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/delete 0.13
189 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsCmd 0.95
190 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsFileCmd 0.97
193 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd 0.45
195 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun 0.46
196 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage 0.18
202 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd 0.16
205 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd 0.73
206 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd 2.21
208 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync 0.27
209 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync 1.71
215 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled 0.58
217 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License 0.24
220 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/StartTunnel 0
227 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DeleteTunnel 0.1
234 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_not_create 0.43
235 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_list 0.4
236 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_json_output 0.41
238 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/specific-port 2.07
239 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/VerifyCleanup 2.13
240 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short 0.05
241 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components 0.54
242 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort 0.25
243 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable 0.23
244 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson 0.25
245 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml 0.25
246 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild 3.44
247 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/Setup 0.24
248 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadDaemon 1.14
249 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageReloadDaemon 1.1
250 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageTagAndLoadDaemon 1.34
251 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveToFile 0.34
252 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageRemove 0.47
253 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadFromFile 0.67
254 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveDaemon 0.4
255 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes 0.15
256 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster 0.19
257 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters 0.15
258 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_echo-server_images 0.04
259 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_my-image_image 0.02
260 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_minikube_cached_images 0.02
264 TestMultiControlPlane/serial/StartCluster 137.71
265 TestMultiControlPlane/serial/DeployApp 7.95
266 TestMultiControlPlane/serial/PingHostFromPods 1.68
267 TestMultiControlPlane/serial/AddWorkerNode 31.02
268 TestMultiControlPlane/serial/NodeLabels 0.12
269 TestMultiControlPlane/serial/HAppyAfterClusterStart 1.14
270 TestMultiControlPlane/serial/CopyFile 20.67
271 TestMultiControlPlane/serial/StopSecondaryNode 13.25
272 TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop 0.84
273 TestMultiControlPlane/serial/RestartSecondaryNode 14.37
274 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart 1.09
275 TestMultiControlPlane/serial/RestartClusterKeepsNodes 98.87
276 TestMultiControlPlane/serial/DeleteSecondaryNode 11.25
277 TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete 0.81
278 TestMultiControlPlane/serial/StopCluster 36.4
279 TestMultiControlPlane/serial/RestartCluster 60.38
280 TestMultiControlPlane/serial/DegradedAfterClusterRestart 0.87
281 TestMultiControlPlane/serial/AddSecondaryNode 61.39
282 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd 1.1
287 TestJSONOutput/start/Command 55.03
288 TestJSONOutput/start/Audit 0
290 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
291 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
293 TestJSONOutput/pause/Command 0.73
294 TestJSONOutput/pause/Audit 0
296 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
297 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
299 TestJSONOutput/unpause/Command 0.66
300 TestJSONOutput/unpause/Audit 0
302 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
303 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
305 TestJSONOutput/stop/Command 5.99
306 TestJSONOutput/stop/Audit 0
308 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
309 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
310 TestErrorJSONOutput 0.24
312 TestKicCustomNetwork/create_custom_network 36.64
313 TestKicCustomNetwork/use_default_bridge_network 37.34
314 TestKicExistingNetwork 34.36
315 TestKicCustomSubnet 34.85
316 TestKicStaticIP 35.58
317 TestMainNoArgs 0.06
318 TestMinikubeProfile 75.09
321 TestMountStart/serial/StartWithMountFirst 8.59
322 TestMountStart/serial/VerifyMountFirst 0.28
323 TestMountStart/serial/StartWithMountSecond 6.06
324 TestMountStart/serial/VerifyMountSecond 0.27
325 TestMountStart/serial/DeleteFirst 1.77
326 TestMountStart/serial/VerifyMountPostDelete 0.28
327 TestMountStart/serial/Stop 1.3
328 TestMountStart/serial/RestartStopped 7.69
329 TestMountStart/serial/VerifyMountPostStop 0.28
332 TestMultiNode/serial/FreshStart2Nodes 80.89
333 TestMultiNode/serial/DeployApp2Nodes 7.07
334 TestMultiNode/serial/PingHostFrom2Pods 1.04
335 TestMultiNode/serial/AddNode 28.83
336 TestMultiNode/serial/MultiNodeLabels 0.09
337 TestMultiNode/serial/ProfileList 0.73
338 TestMultiNode/serial/CopyFile 11.13
339 TestMultiNode/serial/StopNode 2.44
340 TestMultiNode/serial/StartAfterStop 7.89
341 TestMultiNode/serial/RestartKeepsNodes 79.16
342 TestMultiNode/serial/DeleteNode 5.73
343 TestMultiNode/serial/StopMultiNode 24.09
344 TestMultiNode/serial/RestartMultiNode 50.71
345 TestMultiNode/serial/ValidateNameConflict 33.74
350 TestPreload 127.42
352 TestScheduledStopUnix 109.68
355 TestInsufficientStorage 9.79
356 TestRunningBinaryUpgrade 311.31
359 TestMissingContainerUpgrade 134.91
361 TestNoKubernetes/serial/StartNoK8sWithVersion 0.1
362 TestNoKubernetes/serial/StartWithK8s 47.08
363 TestNoKubernetes/serial/StartWithStopK8s 24.41
364 TestNoKubernetes/serial/Start 8.12
365 TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads 0
366 TestNoKubernetes/serial/VerifyK8sNotRunning 0.29
367 TestNoKubernetes/serial/ProfileList 0.75
368 TestNoKubernetes/serial/Stop 1.34
369 TestNoKubernetes/serial/StartNoArgs 7.47
370 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.27
371 TestStoppedBinaryUpgrade/Setup 0.89
372 TestStoppedBinaryUpgrade/Upgrade 304.67
373 TestStoppedBinaryUpgrade/MinikubeLogs 2.64
382 TestPause/serial/Start 52.51
383 TestPause/serial/SecondStartNoReconfiguration 6.24
384 TestPause/serial/Pause 0.73
385 TestPause/serial/VerifyStatus 0.33
386 TestPause/serial/Unpause 0.64
387 TestPause/serial/PauseAgain 0.89
388 TestPause/serial/DeletePaused 2.79
389 TestPause/serial/VerifyDeletedResources 0.42
TestDownloadOnly/v1.28.0/json-events (6.34s)

=== RUN   TestDownloadOnly/v1.28.0/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-122467 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-122467 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (6.337574633s)
--- PASS: TestDownloadOnly/v1.28.0/json-events (6.34s)

TestDownloadOnly/v1.28.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.28.0/preload-exists
I1219 05:43:11.056211 2000386 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
I1219 05:43:11.056299 2000386 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.28.0/preload-exists (0.00s)

TestDownloadOnly/v1.28.0/LogsDuration (0.1s)

=== RUN   TestDownloadOnly/v1.28.0/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-122467
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-122467: exit status 85 (101.366132ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬──────────┐
	│ COMMAND │                                                                                         ARGS                                                                                          │       PROFILE        │  USER   │ VERSION │     START TIME      │ END TIME │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼──────────┤
	│ start   │ -o=json --download-only -p download-only-122467 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-122467 │ jenkins │ v1.37.0 │ 19 Dec 25 05:43 UTC │          │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴──────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 05:43:04
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 05:43:04.759054 2000391 out.go:360] Setting OutFile to fd 1 ...
	I1219 05:43:04.759186 2000391 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:43:04.759196 2000391 out.go:374] Setting ErrFile to fd 2...
	I1219 05:43:04.759201 2000391 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:43:04.759469 2000391 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	W1219 05:43:04.759603 2000391 root.go:314] Error reading config file at /home/jenkins/minikube-integration/22230-1998525/.minikube/config/config.json: open /home/jenkins/minikube-integration/22230-1998525/.minikube/config/config.json: no such file or directory
	I1219 05:43:04.759997 2000391 out.go:368] Setting JSON to true
	I1219 05:43:04.760855 2000391 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":37531,"bootTime":1766085454,"procs":153,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 05:43:04.760921 2000391 start.go:143] virtualization:  
	I1219 05:43:04.766598 2000391 out.go:99] [download-only-122467] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	W1219 05:43:04.766793 2000391 preload.go:354] Failed to list preload files: open /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball: no such file or directory
	I1219 05:43:04.766908 2000391 notify.go:221] Checking for updates...
	I1219 05:43:04.770419 2000391 out.go:171] MINIKUBE_LOCATION=22230
	I1219 05:43:04.773945 2000391 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 05:43:04.777210 2000391 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 05:43:04.780414 2000391 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 05:43:04.783476 2000391 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1219 05:43:04.789484 2000391 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1219 05:43:04.789793 2000391 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 05:43:04.810214 2000391 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 05:43:04.810331 2000391 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 05:43:04.868699 2000391 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:61 SystemTime:2025-12-19 05:43:04.859483692 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 05:43:04.868845 2000391 docker.go:319] overlay module found
	I1219 05:43:04.871927 2000391 out.go:99] Using the docker driver based on user configuration
	I1219 05:43:04.871975 2000391 start.go:309] selected driver: docker
	I1219 05:43:04.871982 2000391 start.go:928] validating driver "docker" against <nil>
	I1219 05:43:04.872098 2000391 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 05:43:04.924445 2000391 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:61 SystemTime:2025-12-19 05:43:04.9155323 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 05:43:04.924610 2000391 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1219 05:43:04.925003 2000391 start_flags.go:411] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1219 05:43:04.925166 2000391 start_flags.go:975] Wait components to verify : map[apiserver:true system_pods:true]
	I1219 05:43:04.928334 2000391 out.go:171] Using Docker driver with root privileges
	I1219 05:43:04.931432 2000391 cni.go:84] Creating CNI manager for ""
	I1219 05:43:04.931505 2000391 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1219 05:43:04.931517 2000391 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1219 05:43:04.931601 2000391 start.go:353] cluster config:
	{Name:download-only-122467 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.0 ClusterName:download-only-122467 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.28.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 05:43:04.934644 2000391 out.go:99] Starting "download-only-122467" primary control-plane node in "download-only-122467" cluster
	I1219 05:43:04.934677 2000391 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1219 05:43:04.937507 2000391 out.go:99] Pulling base image v0.0.48-1765966054-22186 ...
	I1219 05:43:04.937561 2000391 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
	I1219 05:43:04.937641 2000391 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local docker daemon
	I1219 05:43:04.956313 2000391 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 to local cache
	I1219 05:43:04.956509 2000391 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 in local cache directory
	I1219 05:43:04.956611 2000391 image.go:150] Writing gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 to local cache
	I1219 05:43:04.987838 2000391 preload.go:148] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.0/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
	I1219 05:43:04.987876 2000391 cache.go:65] Caching tarball of preloaded images
	I1219 05:43:04.988047 2000391 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
	I1219 05:43:04.991508 2000391 out.go:99] Downloading Kubernetes v1.28.0 preload ...
	I1219 05:43:04.991532 2000391 preload.go:318] getting checksum for preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4 from gcs api...
	I1219 05:43:05.071803 2000391 preload.go:295] Got checksum from GCS API "38d7f581f2fa4226c8af2c9106b982b7"
	I1219 05:43:05.071942 2000391 download.go:108] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.0/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4?checksum=md5:38d7f581f2fa4226c8af2c9106b982b7 -> /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
	I1219 05:43:09.482868 2000391 cache.go:68] Finished verifying existence of preloaded tar for v1.28.0 on containerd
	I1219 05:43:09.483273 2000391 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/download-only-122467/config.json ...
	I1219 05:43:09.483307 2000391 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/download-only-122467/config.json: {Name:mk65b7b205e58d94e24b07fcea6fbb400ab1e853 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1219 05:43:09.483491 2000391 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
	I1219 05:43:09.483684 2000391 download.go:108] Downloading: https://dl.k8s.io/release/v1.28.0/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.28.0/bin/linux/arm64/kubectl.sha256 -> /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/linux/arm64/v1.28.0/kubectl
	
	
	* The control-plane node download-only-122467 host does not exist
	  To start a cluster, run: "minikube start -p download-only-122467"

-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.28.0/LogsDuration (0.10s)

TestDownloadOnly/v1.28.0/DeleteAll (0.22s)

=== RUN   TestDownloadOnly/v1.28.0/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.28.0/DeleteAll (0.22s)

TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds (0.15s)

=== RUN   TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-122467
--- PASS: TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds (0.15s)

TestDownloadOnly/v1.34.3/json-events (4.12s)

=== RUN   TestDownloadOnly/v1.34.3/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-336768 --force --alsologtostderr --kubernetes-version=v1.34.3 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-336768 --force --alsologtostderr --kubernetes-version=v1.34.3 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (4.115026784s)
--- PASS: TestDownloadOnly/v1.34.3/json-events (4.12s)

TestDownloadOnly/v1.34.3/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.34.3/preload-exists
I1219 05:43:15.644780 2000386 preload.go:188] Checking if preload exists for k8s version v1.34.3 and runtime containerd
I1219 05:43:15.644819 2000386 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.3-containerd-overlay2-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.34.3/preload-exists (0.00s)

TestDownloadOnly/v1.34.3/LogsDuration (0.27s)

=== RUN   TestDownloadOnly/v1.34.3/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-336768
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-336768: exit status 85 (268.374268ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                         ARGS                                                                                          │       PROFILE        │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -o=json --download-only -p download-only-122467 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-122467 │ jenkins │ v1.37.0 │ 19 Dec 25 05:43 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                 │ minikube             │ jenkins │ v1.37.0 │ 19 Dec 25 05:43 UTC │ 19 Dec 25 05:43 UTC │
	│ delete  │ -p download-only-122467                                                                                                                                                               │ download-only-122467 │ jenkins │ v1.37.0 │ 19 Dec 25 05:43 UTC │ 19 Dec 25 05:43 UTC │
	│ start   │ -o=json --download-only -p download-only-336768 --force --alsologtostderr --kubernetes-version=v1.34.3 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-336768 │ jenkins │ v1.37.0 │ 19 Dec 25 05:43 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 05:43:11
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 05:43:11.574996 2000596 out.go:360] Setting OutFile to fd 1 ...
	I1219 05:43:11.575122 2000596 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:43:11.575134 2000596 out.go:374] Setting ErrFile to fd 2...
	I1219 05:43:11.575140 2000596 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:43:11.575475 2000596 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 05:43:11.575941 2000596 out.go:368] Setting JSON to true
	I1219 05:43:11.577037 2000596 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":37538,"bootTime":1766085454,"procs":147,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 05:43:11.577165 2000596 start.go:143] virtualization:  
	I1219 05:43:11.580813 2000596 out.go:99] [download-only-336768] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 05:43:11.581079 2000596 notify.go:221] Checking for updates...
	I1219 05:43:11.584014 2000596 out.go:171] MINIKUBE_LOCATION=22230
	I1219 05:43:11.587090 2000596 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 05:43:11.590017 2000596 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 05:43:11.592989 2000596 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 05:43:11.595924 2000596 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1219 05:43:11.601608 2000596 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1219 05:43:11.601886 2000596 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 05:43:11.632744 2000596 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 05:43:11.632881 2000596 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 05:43:11.686714 2000596 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:49 SystemTime:2025-12-19 05:43:11.677401364 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 05:43:11.686826 2000596 docker.go:319] overlay module found
	I1219 05:43:11.689880 2000596 out.go:99] Using the docker driver based on user configuration
	I1219 05:43:11.689929 2000596 start.go:309] selected driver: docker
	I1219 05:43:11.689938 2000596 start.go:928] validating driver "docker" against <nil>
	I1219 05:43:11.690063 2000596 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 05:43:11.741682 2000596 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:49 SystemTime:2025-12-19 05:43:11.732883933 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 05:43:11.741852 2000596 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1219 05:43:11.742160 2000596 start_flags.go:411] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1219 05:43:11.742317 2000596 start_flags.go:975] Wait components to verify : map[apiserver:true system_pods:true]
	I1219 05:43:11.745471 2000596 out.go:171] Using Docker driver with root privileges
	
	
	* The control-plane node download-only-336768 host does not exist
	  To start a cluster, run: "minikube start -p download-only-336768"

-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.34.3/LogsDuration (0.27s)

TestDownloadOnly/v1.34.3/DeleteAll (0.39s)

=== RUN   TestDownloadOnly/v1.34.3/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.34.3/DeleteAll (0.39s)

TestDownloadOnly/v1.34.3/DeleteAlwaysSucceeds (0.23s)

=== RUN   TestDownloadOnly/v1.34.3/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-336768
--- PASS: TestDownloadOnly/v1.34.3/DeleteAlwaysSucceeds (0.23s)

TestDownloadOnly/v1.35.0-rc.1/json-events (4.2s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-853815 --force --alsologtostderr --kubernetes-version=v1.35.0-rc.1 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-853815 --force --alsologtostderr --kubernetes-version=v1.35.0-rc.1 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (4.198537003s)
--- PASS: TestDownloadOnly/v1.35.0-rc.1/json-events (4.20s)

TestDownloadOnly/v1.35.0-rc.1/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/preload-exists
I1219 05:43:20.729729 2000386 preload.go:188] Checking if preload exists for k8s version v1.35.0-rc.1 and runtime containerd
I1219 05:43:20.729765 2000386 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.35.0-rc.1-containerd-overlay2-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.35.0-rc.1/preload-exists (0.00s)

TestDownloadOnly/v1.35.0-rc.1/LogsDuration (0.09s)

=== RUN   TestDownloadOnly/v1.35.0-rc.1/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-853815
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-853815: exit status 85 (87.42075ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                            ARGS                                                                                            │       PROFILE        │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -o=json --download-only -p download-only-122467 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd      │ download-only-122467 │ jenkins │ v1.37.0 │ 19 Dec 25 05:43 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                      │ minikube             │ jenkins │ v1.37.0 │ 19 Dec 25 05:43 UTC │ 19 Dec 25 05:43 UTC │
	│ delete  │ -p download-only-122467                                                                                                                                                                    │ download-only-122467 │ jenkins │ v1.37.0 │ 19 Dec 25 05:43 UTC │ 19 Dec 25 05:43 UTC │
	│ start   │ -o=json --download-only -p download-only-336768 --force --alsologtostderr --kubernetes-version=v1.34.3 --container-runtime=containerd --driver=docker  --container-runtime=containerd      │ download-only-336768 │ jenkins │ v1.37.0 │ 19 Dec 25 05:43 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                      │ minikube             │ jenkins │ v1.37.0 │ 19 Dec 25 05:43 UTC │ 19 Dec 25 05:43 UTC │
	│ delete  │ -p download-only-336768                                                                                                                                                                    │ download-only-336768 │ jenkins │ v1.37.0 │ 19 Dec 25 05:43 UTC │ 19 Dec 25 05:43 UTC │
	│ start   │ -o=json --download-only -p download-only-853815 --force --alsologtostderr --kubernetes-version=v1.35.0-rc.1 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-853815 │ jenkins │ v1.37.0 │ 19 Dec 25 05:43 UTC │                     │
	└─────────┴────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/19 05:43:16
	Running on machine: ip-172-31-30-239
	Binary: Built with gc go1.25.5 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1219 05:43:16.575332 2000800 out.go:360] Setting OutFile to fd 1 ...
	I1219 05:43:16.575454 2000800 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:43:16.575490 2000800 out.go:374] Setting ErrFile to fd 2...
	I1219 05:43:16.575506 2000800 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:43:16.575764 2000800 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 05:43:16.576178 2000800 out.go:368] Setting JSON to true
	I1219 05:43:16.577020 2000800 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":37543,"bootTime":1766085454,"procs":147,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 05:43:16.577095 2000800 start.go:143] virtualization:  
	I1219 05:43:16.623289 2000800 out.go:99] [download-only-853815] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 05:43:16.623658 2000800 notify.go:221] Checking for updates...
	I1219 05:43:16.669426 2000800 out.go:171] MINIKUBE_LOCATION=22230
	I1219 05:43:16.698728 2000800 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 05:43:16.727541 2000800 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 05:43:16.760537 2000800 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 05:43:16.791844 2000800 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1219 05:43:16.855937 2000800 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1219 05:43:16.856250 2000800 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 05:43:16.878238 2000800 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 05:43:16.878359 2000800 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 05:43:16.934529 2000800 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-19 05:43:16.925520406 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 05:43:16.934655 2000800 docker.go:319] overlay module found
	I1219 05:43:16.938978 2000800 out.go:99] Using the docker driver based on user configuration
	I1219 05:43:16.939023 2000800 start.go:309] selected driver: docker
	I1219 05:43:16.939032 2000800 start.go:928] validating driver "docker" against <nil>
	I1219 05:43:16.939137 2000800 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 05:43:17.009362 2000800 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-19 05:43:16.9975148 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aar
ch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pat
h:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 05:43:17.009535 2000800 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1219 05:43:17.009938 2000800 start_flags.go:411] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1219 05:43:17.010161 2000800 start_flags.go:975] Wait components to verify : map[apiserver:true system_pods:true]
	I1219 05:43:17.014933 2000800 out.go:171] Using Docker driver with root privileges
	
	
	* The control-plane node download-only-853815 host does not exist
	  To start a cluster, run: "minikube start -p download-only-853815"

-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.35.0-rc.1/LogsDuration (0.09s)

TestDownloadOnly/v1.35.0-rc.1/DeleteAll (0.22s)
=== RUN   TestDownloadOnly/v1.35.0-rc.1/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.35.0-rc.1/DeleteAll (0.22s)

TestDownloadOnly/v1.35.0-rc.1/DeleteAlwaysSucceeds (0.14s)
=== RUN   TestDownloadOnly/v1.35.0-rc.1/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-853815
--- PASS: TestDownloadOnly/v1.35.0-rc.1/DeleteAlwaysSucceeds (0.14s)

TestBinaryMirror (0.62s)
=== RUN   TestBinaryMirror
I1219 05:43:22.036260 2000386 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.3/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.34.3/bin/linux/arm64/kubectl.sha256
aaa_download_only_test.go:309: (dbg) Run:  out/minikube-linux-arm64 start --download-only -p binary-mirror-815660 --alsologtostderr --binary-mirror http://127.0.0.1:38579 --driver=docker  --container-runtime=containerd
helpers_test.go:176: Cleaning up "binary-mirror-815660" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p binary-mirror-815660
--- PASS: TestBinaryMirror (0.62s)

TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.07s)
=== RUN   TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/EnablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
addons_test.go:1002: (dbg) Run:  out/minikube-linux-arm64 addons enable dashboard -p addons-064622
addons_test.go:1002: (dbg) Non-zero exit: out/minikube-linux-arm64 addons enable dashboard -p addons-064622: exit status 85 (66.765662ms)

-- stdout --
	* Profile "addons-064622" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-064622"

-- /stdout --
--- PASS: TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.07s)

TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.08s)
=== RUN   TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/DisablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
addons_test.go:1013: (dbg) Run:  out/minikube-linux-arm64 addons disable dashboard -p addons-064622
addons_test.go:1013: (dbg) Non-zero exit: out/minikube-linux-arm64 addons disable dashboard -p addons-064622: exit status 85 (75.915727ms)

-- stdout --
	* Profile "addons-064622" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-064622"

-- /stdout --
--- PASS: TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.08s)

TestAddons/Setup (124.59s)
=== RUN   TestAddons/Setup
addons_test.go:110: (dbg) Run:  out/minikube-linux-arm64 start -p addons-064622 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher
addons_test.go:110: (dbg) Done: out/minikube-linux-arm64 start -p addons-064622 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher: (2m4.594842416s)
--- PASS: TestAddons/Setup (124.59s)

TestAddons/serial/Volcano (41.75s)
=== RUN   TestAddons/serial/Volcano
addons_test.go:886: volcano-controller stabilized in 51.879693ms
addons_test.go:878: volcano-admission stabilized in 52.051067ms
addons_test.go:870: volcano-scheduler stabilized in 52.324252ms
addons_test.go:892: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-scheduler" in namespace "volcano-system" ...
helpers_test.go:353: "volcano-scheduler-76c996c8bf-vhxtp" [a9edbc45-5f57-4bca-b600-87b27de86978] Running
addons_test.go:892: (dbg) TestAddons/serial/Volcano: app=volcano-scheduler healthy within 6.003436616s
addons_test.go:896: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-admission" in namespace "volcano-system" ...
helpers_test.go:353: "volcano-admission-6c447bd768-286mh" [bf563faa-d884-4ab5-ad81-1804d6eb4c9e] Pending / Ready:ContainersNotReady (containers with unready status: [admission]) / ContainersReady:ContainersNotReady (containers with unready status: [admission])
helpers_test.go:353: "volcano-admission-6c447bd768-286mh" [bf563faa-d884-4ab5-ad81-1804d6eb4c9e] Running
addons_test.go:896: (dbg) TestAddons/serial/Volcano: app=volcano-admission healthy within 7.004428568s
addons_test.go:900: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-controller" in namespace "volcano-system" ...
helpers_test.go:353: "volcano-controllers-6fd4f85cb8-8hh6q" [14a05674-5182-4647-a406-25839e74f459] Running
addons_test.go:900: (dbg) TestAddons/serial/Volcano: app=volcano-controller healthy within 5.004133734s
addons_test.go:905: (dbg) Run:  kubectl --context addons-064622 delete -n volcano-system job volcano-admission-init
addons_test.go:911: (dbg) Run:  kubectl --context addons-064622 create -f testdata/vcjob.yaml
addons_test.go:919: (dbg) Run:  kubectl --context addons-064622 get vcjob -n my-volcano
addons_test.go:937: (dbg) TestAddons/serial/Volcano: waiting 3m0s for pods matching "volcano.sh/job-name=test-job" in namespace "my-volcano" ...
helpers_test.go:353: "test-job-nginx-0" [0f0da6b3-dbbb-4540-b251-c473a1130321] Pending
helpers_test.go:353: "test-job-nginx-0" [0f0da6b3-dbbb-4540-b251-c473a1130321] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:353: "test-job-nginx-0" [0f0da6b3-dbbb-4540-b251-c473a1130321] Running
addons_test.go:937: (dbg) TestAddons/serial/Volcano: volcano.sh/job-name=test-job healthy within 11.003657091s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable volcano --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-064622 addons disable volcano --alsologtostderr -v=1: (12.094819598s)
--- PASS: TestAddons/serial/Volcano (41.75s)

TestAddons/serial/GCPAuth/Namespaces (0.16s)
=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:632: (dbg) Run:  kubectl --context addons-064622 create ns new-namespace
addons_test.go:646: (dbg) Run:  kubectl --context addons-064622 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.16s)

TestAddons/serial/GCPAuth/FakeCredentials (9.82s)
=== RUN   TestAddons/serial/GCPAuth/FakeCredentials
addons_test.go:677: (dbg) Run:  kubectl --context addons-064622 create -f testdata/busybox.yaml
addons_test.go:684: (dbg) Run:  kubectl --context addons-064622 create sa gcp-auth-test
addons_test.go:690: (dbg) TestAddons/serial/GCPAuth/FakeCredentials: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:353: "busybox" [6f410fc5-00b5-4e3c-930d-80a3c6d33310] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:353: "busybox" [6f410fc5-00b5-4e3c-930d-80a3c6d33310] Running
addons_test.go:690: (dbg) TestAddons/serial/GCPAuth/FakeCredentials: integration-test=busybox healthy within 9.003653549s
addons_test.go:696: (dbg) Run:  kubectl --context addons-064622 exec busybox -- /bin/sh -c "printenv GOOGLE_APPLICATION_CREDENTIALS"
addons_test.go:708: (dbg) Run:  kubectl --context addons-064622 describe sa gcp-auth-test
addons_test.go:722: (dbg) Run:  kubectl --context addons-064622 exec busybox -- /bin/sh -c "cat /google-app-creds.json"
addons_test.go:746: (dbg) Run:  kubectl --context addons-064622 exec busybox -- /bin/sh -c "printenv GOOGLE_CLOUD_PROJECT"
--- PASS: TestAddons/serial/GCPAuth/FakeCredentials (9.82s)

TestAddons/parallel/Registry (16.64s)
=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:384: registry stabilized in 5.050141ms
addons_test.go:386: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:353: "registry-6b586f9694-cxrlk" [829762a0-ddd9-427d-935e-5d89f4a54fc5] Running
addons_test.go:386: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 6.00317307s
addons_test.go:389: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:353: "registry-proxy-8s2nw" [5ff0696e-40a9-4940-ba46-a27b35fafef7] Running
addons_test.go:389: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.003859083s
addons_test.go:394: (dbg) Run:  kubectl --context addons-064622 delete po -l run=registry-test --now
addons_test.go:399: (dbg) Run:  kubectl --context addons-064622 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:399: (dbg) Done: kubectl --context addons-064622 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (4.56328808s)
addons_test.go:413: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 ip
2025/12/19 05:46:44 [DEBUG] GET http://192.168.49.2:5000
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (16.64s)

TestAddons/parallel/RegistryCreds (0.77s)
=== RUN   TestAddons/parallel/RegistryCreds
=== PAUSE TestAddons/parallel/RegistryCreds

=== CONT  TestAddons/parallel/RegistryCreds
addons_test.go:325: registry-creds stabilized in 3.00709ms
addons_test.go:327: (dbg) Run:  out/minikube-linux-arm64 addons configure registry-creds -f ./testdata/addons_testconfig.json -p addons-064622
addons_test.go:334: (dbg) Run:  kubectl --context addons-064622 -n kube-system get secret -o yaml
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable registry-creds --alsologtostderr -v=1
--- PASS: TestAddons/parallel/RegistryCreds (0.77s)

TestAddons/parallel/Ingress (18.58s)
=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

=== CONT  TestAddons/parallel/Ingress
addons_test.go:211: (dbg) Run:  kubectl --context addons-064622 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:236: (dbg) Run:  kubectl --context addons-064622 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:249: (dbg) Run:  kubectl --context addons-064622 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:254: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:353: "nginx" [100ada84-1c8f-4f65-a19a-eb32ec2ee1aa] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:353: "nginx" [100ada84-1c8f-4f65-a19a-eb32ec2ee1aa] Running
addons_test.go:254: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 7.003053075s
I1219 05:47:12.357432 2000386 kapi.go:150] Service nginx in namespace default found.
addons_test.go:266: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:290: (dbg) Run:  kubectl --context addons-064622 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:295: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 ip
addons_test.go:301: (dbg) Run:  nslookup hello-john.test 192.168.49.2
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-064622 addons disable ingress-dns --alsologtostderr -v=1: (1.716253096s)
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable ingress --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-064622 addons disable ingress --alsologtostderr -v=1: (7.955471056s)
--- PASS: TestAddons/parallel/Ingress (18.58s)

TestAddons/parallel/InspektorGadget (11.86s)
=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget

=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:825: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:353: "gadget-xqcgl" [f68cef84-a64f-4c45-8e58-28f70c9dfca5] Running
addons_test.go:825: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 6.003745258s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable inspektor-gadget --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-064622 addons disable inspektor-gadget --alsologtostderr -v=1: (5.857287899s)
--- PASS: TestAddons/parallel/InspektorGadget (11.86s)

TestAddons/parallel/MetricsServer (6.96s)
=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:457: metrics-server stabilized in 4.124198ms
addons_test.go:459: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:353: "metrics-server-85b7d694d7-g2xlj" [f39c0d30-3c81-4e73-bf8c-20fac131eb0d] Running
addons_test.go:459: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 6.004028192s
addons_test.go:465: (dbg) Run:  kubectl --context addons-064622 top pods -n kube-system
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (6.96s)

TestAddons/parallel/CSI (49.33s)
=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

=== CONT  TestAddons/parallel/CSI
I1219 05:46:45.068366 2000386 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=csi-hostpath-driver" in ns "kube-system" ...
I1219 05:46:45.072528 2000386 kapi.go:86] Found 3 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
I1219 05:46:45.072566 2000386 kapi.go:107] duration metric: took 29.543711ms to wait for kubernetes.io/minikube-addons=csi-hostpath-driver ...
addons_test.go:551: csi-hostpath-driver pods stabilized in 29.557422ms
addons_test.go:554: (dbg) Run:  kubectl --context addons-064622 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:559: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:564: (dbg) Run:  kubectl --context addons-064622 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:569: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:353: "task-pv-pod" [3d440c72-c94d-4ffe-99b6-fc07a3b92b5a] Pending
helpers_test.go:353: "task-pv-pod" [3d440c72-c94d-4ffe-99b6-fc07a3b92b5a] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:353: "task-pv-pod" [3d440c72-c94d-4ffe-99b6-fc07a3b92b5a] Running
addons_test.go:569: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 8.005287059s
addons_test.go:574: (dbg) Run:  kubectl --context addons-064622 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:579: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:428: (dbg) Run:  kubectl --context addons-064622 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:428: (dbg) Run:  kubectl --context addons-064622 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:584: (dbg) Run:  kubectl --context addons-064622 delete pod task-pv-pod
addons_test.go:584: (dbg) Done: kubectl --context addons-064622 delete pod task-pv-pod: (1.236676679s)
addons_test.go:590: (dbg) Run:  kubectl --context addons-064622 delete pvc hpvc
addons_test.go:596: (dbg) Run:  kubectl --context addons-064622 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:601: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:606: (dbg) Run:  kubectl --context addons-064622 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:611: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:353: "task-pv-pod-restore" [c9937ec1-b676-45ba-a154-fa60260b262c] Pending
helpers_test.go:353: "task-pv-pod-restore" [c9937ec1-b676-45ba-a154-fa60260b262c] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:353: "task-pv-pod-restore" [c9937ec1-b676-45ba-a154-fa60260b262c] Running
addons_test.go:611: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 8.004497129s
addons_test.go:616: (dbg) Run:  kubectl --context addons-064622 delete pod task-pv-pod-restore
addons_test.go:620: (dbg) Run:  kubectl --context addons-064622 delete pvc hpvc-restore
addons_test.go:624: (dbg) Run:  kubectl --context addons-064622 delete volumesnapshot new-snapshot-demo
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable volumesnapshots --alsologtostderr -v=1
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-064622 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.976757848s)
--- PASS: TestAddons/parallel/CSI (49.33s)

TestAddons/parallel/Headlamp (17.33s)
=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:810: (dbg) Run:  out/minikube-linux-arm64 addons enable headlamp -p addons-064622 --alsologtostderr -v=1
addons_test.go:810: (dbg) Done: out/minikube-linux-arm64 addons enable headlamp -p addons-064622 --alsologtostderr -v=1: (1.101320764s)
addons_test.go:815: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:353: "headlamp-dfcdc64b-sc7gv" [b53d4617-f053-4838-9157-7340cd942a11] Pending
helpers_test.go:353: "headlamp-dfcdc64b-sc7gv" [b53d4617-f053-4838-9157-7340cd942a11] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:353: "headlamp-dfcdc64b-sc7gv" [b53d4617-f053-4838-9157-7340cd942a11] Running
addons_test.go:815: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 10.004033914s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable headlamp --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-064622 addons disable headlamp --alsologtostderr -v=1: (6.222949347s)
--- PASS: TestAddons/parallel/Headlamp (17.33s)

TestAddons/parallel/CloudSpanner (5.62s)
=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner

=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:842: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:353: "cloud-spanner-emulator-5bdddb765-prq6h" [80a81a9a-3b45-41c4-a4a5-f03c99231d48] Running
addons_test.go:842: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 5.003561322s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable cloud-spanner --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CloudSpanner (5.62s)

TestAddons/parallel/LocalPath (51.05s)
=== RUN   TestAddons/parallel/LocalPath
=== PAUSE TestAddons/parallel/LocalPath

=== CONT  TestAddons/parallel/LocalPath
addons_test.go:951: (dbg) Run:  kubectl --context addons-064622 apply -f testdata/storage-provisioner-rancher/pvc.yaml
addons_test.go:957: (dbg) Run:  kubectl --context addons-064622 apply -f testdata/storage-provisioner-rancher/pod.yaml
addons_test.go:961: (dbg) TestAddons/parallel/LocalPath: waiting 5m0s for pvc "test-pvc" in namespace "default" ...
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:403: (dbg) Run:  kubectl --context addons-064622 get pvc test-pvc -o jsonpath={.status.phase} -n default
addons_test.go:964: (dbg) TestAddons/parallel/LocalPath: waiting 3m0s for pods matching "run=test-local-path" in namespace "default" ...
helpers_test.go:353: "test-local-path" [c617d939-0bc7-4984-bc9e-1664e4c57fcd] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:353: "test-local-path" [c617d939-0bc7-4984-bc9e-1664e4c57fcd] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:353: "test-local-path" [c617d939-0bc7-4984-bc9e-1664e4c57fcd] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
addons_test.go:964: (dbg) TestAddons/parallel/LocalPath: run=test-local-path healthy within 3.003167078s
addons_test.go:969: (dbg) Run:  kubectl --context addons-064622 get pvc test-pvc -o=json
addons_test.go:978: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 ssh "cat /opt/local-path-provisioner/pvc-eea8f01a-5901-4924-b598-e3398f5a7aa4_default_test-pvc/file1"
addons_test.go:990: (dbg) Run:  kubectl --context addons-064622 delete pod test-local-path
addons_test.go:994: (dbg) Run:  kubectl --context addons-064622 delete pvc test-pvc
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable storage-provisioner-rancher --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-064622 addons disable storage-provisioner-rancher --alsologtostderr -v=1: (42.904029338s)
--- PASS: TestAddons/parallel/LocalPath (51.05s)

TestAddons/parallel/NvidiaDevicePlugin (5.64s)

=== RUN   TestAddons/parallel/NvidiaDevicePlugin
=== PAUSE TestAddons/parallel/NvidiaDevicePlugin

=== CONT  TestAddons/parallel/NvidiaDevicePlugin
addons_test.go:1027: (dbg) TestAddons/parallel/NvidiaDevicePlugin: waiting 6m0s for pods matching "name=nvidia-device-plugin-ds" in namespace "kube-system" ...
helpers_test.go:353: "nvidia-device-plugin-daemonset-2vhd9" [4f253db4-9965-4a98-af61-de87c16246ae] Running
addons_test.go:1027: (dbg) TestAddons/parallel/NvidiaDevicePlugin: name=nvidia-device-plugin-ds healthy within 5.007813376s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable nvidia-device-plugin --alsologtostderr -v=1
--- PASS: TestAddons/parallel/NvidiaDevicePlugin (5.64s)

TestAddons/parallel/Yakd (11.9s)

=== RUN   TestAddons/parallel/Yakd
=== PAUSE TestAddons/parallel/Yakd

=== CONT  TestAddons/parallel/Yakd
addons_test.go:1049: (dbg) TestAddons/parallel/Yakd: waiting 2m0s for pods matching "app.kubernetes.io/name=yakd-dashboard" in namespace "yakd-dashboard" ...
helpers_test.go:353: "yakd-dashboard-6654c87f9b-jp96m" [016db717-2c01-4c14-aece-1e4c5dd7198b] Running
addons_test.go:1049: (dbg) TestAddons/parallel/Yakd: app.kubernetes.io/name=yakd-dashboard healthy within 6.003319757s
addons_test.go:1055: (dbg) Run:  out/minikube-linux-arm64 -p addons-064622 addons disable yakd --alsologtostderr -v=1
addons_test.go:1055: (dbg) Done: out/minikube-linux-arm64 -p addons-064622 addons disable yakd --alsologtostderr -v=1: (5.899874553s)
--- PASS: TestAddons/parallel/Yakd (11.90s)

TestAddons/StoppedEnableDisable (12.4s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:174: (dbg) Run:  out/minikube-linux-arm64 stop -p addons-064622
addons_test.go:174: (dbg) Done: out/minikube-linux-arm64 stop -p addons-064622: (12.075985901s)
addons_test.go:178: (dbg) Run:  out/minikube-linux-arm64 addons enable dashboard -p addons-064622
addons_test.go:182: (dbg) Run:  out/minikube-linux-arm64 addons disable dashboard -p addons-064622
addons_test.go:187: (dbg) Run:  out/minikube-linux-arm64 addons disable gvisor -p addons-064622
--- PASS: TestAddons/StoppedEnableDisable (12.40s)

TestCertOptions (41.18s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-linux-arm64 start -p cert-options-038010 --memory=3072 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=docker  --container-runtime=containerd
cert_options_test.go:49: (dbg) Done: out/minikube-linux-arm64 start -p cert-options-038010 --memory=3072 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=docker  --container-runtime=containerd: (37.565389543s)
cert_options_test.go:60: (dbg) Run:  out/minikube-linux-arm64 -p cert-options-038010 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-038010 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-linux-arm64 ssh -p cert-options-038010 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:176: Cleaning up "cert-options-038010" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p cert-options-038010
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p cert-options-038010: (2.514926249s)
--- PASS: TestCertOptions (41.18s)

TestCertExpiration (227.48s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-linux-arm64 start -p cert-expiration-547453 --memory=3072 --cert-expiration=3m --driver=docker  --container-runtime=containerd
E1219 07:07:52.458757 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
cert_options_test.go:123: (dbg) Done: out/minikube-linux-arm64 start -p cert-expiration-547453 --memory=3072 --cert-expiration=3m --driver=docker  --container-runtime=containerd: (37.971541s)
cert_options_test.go:131: (dbg) Run:  out/minikube-linux-arm64 start -p cert-expiration-547453 --memory=3072 --cert-expiration=8760h --driver=docker  --container-runtime=containerd
E1219 07:11:29.406621 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
cert_options_test.go:131: (dbg) Done: out/minikube-linux-arm64 start -p cert-expiration-547453 --memory=3072 --cert-expiration=8760h --driver=docker  --container-runtime=containerd: (7.002248011s)
helpers_test.go:176: Cleaning up "cert-expiration-547453" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p cert-expiration-547453
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p cert-expiration-547453: (2.501079291s)
--- PASS: TestCertExpiration (227.48s)

TestForceSystemdFlag (33.34s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag

=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-linux-arm64 start -p force-systemd-flag-173900 --memory=3072 --force-systemd --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
E1219 07:06:29.406244 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
docker_test.go:91: (dbg) Done: out/minikube-linux-arm64 start -p force-systemd-flag-173900 --memory=3072 --force-systemd --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (30.911388311s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-arm64 -p force-systemd-flag-173900 ssh "cat /etc/containerd/config.toml"
helpers_test.go:176: Cleaning up "force-systemd-flag-173900" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p force-systemd-flag-173900
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p force-systemd-flag-173900: (2.095721995s)
--- PASS: TestForceSystemdFlag (33.34s)

TestForceSystemdEnv (42.27s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-linux-arm64 start -p force-systemd-env-631273 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
docker_test.go:155: (dbg) Done: out/minikube-linux-arm64 start -p force-systemd-env-631273 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (39.359700608s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-arm64 -p force-systemd-env-631273 ssh "cat /etc/containerd/config.toml"
helpers_test.go:176: Cleaning up "force-systemd-env-631273" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p force-systemd-env-631273
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p force-systemd-env-631273: (2.50234496s)
--- PASS: TestForceSystemdEnv (42.27s)

TestDockerEnvContainerd (47.72s)

=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with containerd true linux arm64
docker_test.go:181: (dbg) Run:  out/minikube-linux-arm64 start -p dockerenv-654919 --driver=docker  --container-runtime=containerd
docker_test.go:181: (dbg) Done: out/minikube-linux-arm64 start -p dockerenv-654919 --driver=docker  --container-runtime=containerd: (32.317749319s)
docker_test.go:189: (dbg) Run:  /bin/bash -c "out/minikube-linux-arm64 docker-env --ssh-host --ssh-add -p dockerenv-654919"
docker_test.go:189: (dbg) Done: /bin/bash -c "out/minikube-linux-arm64 docker-env --ssh-host --ssh-add -p dockerenv-654919": (1.089220485s)
docker_test.go:220: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-jA7pKACmquqh/agent.2019918" SSH_AGENT_PID="2019919" DOCKER_HOST=ssh://docker@127.0.0.1:34689 docker version"
docker_test.go:243: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-jA7pKACmquqh/agent.2019918" SSH_AGENT_PID="2019919" DOCKER_HOST=ssh://docker@127.0.0.1:34689 DOCKER_BUILDKIT=0 docker build -t local/minikube-dockerenv-containerd-test:latest testdata/docker-env"
docker_test.go:243: (dbg) Done: /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-jA7pKACmquqh/agent.2019918" SSH_AGENT_PID="2019919" DOCKER_HOST=ssh://docker@127.0.0.1:34689 DOCKER_BUILDKIT=0 docker build -t local/minikube-dockerenv-containerd-test:latest testdata/docker-env": (1.248447217s)
docker_test.go:250: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-jA7pKACmquqh/agent.2019918" SSH_AGENT_PID="2019919" DOCKER_HOST=ssh://docker@127.0.0.1:34689 docker image ls"
helpers_test.go:176: Cleaning up "dockerenv-654919" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p dockerenv-654919
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p dockerenv-654919: (2.126189749s)
--- PASS: TestDockerEnvContainerd (47.72s)

TestErrorSpam/setup (32.89s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-linux-arm64 start -p nospam-899079 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-899079 --driver=docker  --container-runtime=containerd
error_spam_test.go:81: (dbg) Done: out/minikube-linux-arm64 start -p nospam-899079 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-899079 --driver=docker  --container-runtime=containerd: (32.889053875s)
--- PASS: TestErrorSpam/setup (32.89s)

TestErrorSpam/start (0.77s)

=== RUN   TestErrorSpam/start
error_spam_test.go:206: Cleaning up 1 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 start --dry-run
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 start --dry-run
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 start --dry-run
--- PASS: TestErrorSpam/start (0.77s)

TestErrorSpam/status (1.11s)

=== RUN   TestErrorSpam/status
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 status
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 status
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 status
--- PASS: TestErrorSpam/status (1.11s)

TestErrorSpam/pause (1.84s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 pause
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 pause
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 pause
--- PASS: TestErrorSpam/pause (1.84s)

TestErrorSpam/unpause (1.89s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 unpause
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 unpause
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 unpause
--- PASS: TestErrorSpam/unpause (1.89s)

TestErrorSpam/stop (1.62s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 stop
error_spam_test.go:149: (dbg) Done: out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 stop: (1.423590843s)
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 stop
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-899079 --log_dir /tmp/nospam-899079 stop
--- PASS: TestErrorSpam/stop (1.62s)

TestFunctional/serial/CopySyncFile (0s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1860: local sync path: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

TestFunctional/serial/StartWithProxy (51.77s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2239: (dbg) Run:  out/minikube-linux-arm64 start -p functional-125117 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd
E1219 05:50:27.488828 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:50:27.494355 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:50:27.504611 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:50:27.524866 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:50:27.565147 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:50:27.645464 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:50:27.805860 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:50:28.126415 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:50:28.767304 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:50:30.047621 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:50:32.608912 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:50:37.729157 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:50:47.970257 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:2239: (dbg) Done: out/minikube-linux-arm64 start -p functional-125117 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd: (51.76608795s)
--- PASS: TestFunctional/serial/StartWithProxy (51.77s)

TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (7.39s)

=== RUN   TestFunctional/serial/SoftStart
I1219 05:51:07.930727 2000386 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
functional_test.go:674: (dbg) Run:  out/minikube-linux-arm64 start -p functional-125117 --alsologtostderr -v=8
E1219 05:51:08.450566 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:674: (dbg) Done: out/minikube-linux-arm64 start -p functional-125117 --alsologtostderr -v=8: (7.39190693s)
functional_test.go:678: soft start took 7.394669857s for "functional-125117" cluster.
I1219 05:51:15.322952 2000386 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
--- PASS: TestFunctional/serial/SoftStart (7.39s)

TestFunctional/serial/KubeContext (0.06s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:696: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.06s)

TestFunctional/serial/KubectlGetPods (0.09s)

=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:711: (dbg) Run:  kubectl --context functional-125117 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.09s)

TestFunctional/serial/CacheCmd/cache/add_remote (3.54s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 cache add registry.k8s.io/pause:3.1
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-125117 cache add registry.k8s.io/pause:3.1: (1.344807937s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 cache add registry.k8s.io/pause:3.3
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-125117 cache add registry.k8s.io/pause:3.3: (1.139107871s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 cache add registry.k8s.io/pause:latest
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-125117 cache add registry.k8s.io/pause:latest: (1.052533938s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (3.54s)

TestFunctional/serial/CacheCmd/cache/add_local (1.24s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1092: (dbg) Run:  docker build -t minikube-local-cache-test:functional-125117 /tmp/TestFunctionalserialCacheCmdcacheadd_local2885276149/001
functional_test.go:1104: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 cache add minikube-local-cache-test:functional-125117
functional_test.go:1109: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 cache delete minikube-local-cache-test:functional-125117
functional_test.go:1098: (dbg) Run:  docker rmi minikube-local-cache-test:functional-125117
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.24s)

TestFunctional/serial/CacheCmd/cache/CacheDelete (0.05s)

=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1117: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.05s)

TestFunctional/serial/CacheCmd/cache/list (0.05s)

=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1125: (dbg) Run:  out/minikube-linux-arm64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.05s)

TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.3s)

=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1139: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.30s)

TestFunctional/serial/CacheCmd/cache/cache_reload (1.92s)

=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1162: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-125117 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (294.185473ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1173: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 cache reload
functional_test.go:1178: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.92s)

TestFunctional/serial/CacheCmd/cache/delete (0.11s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.11s)

TestFunctional/serial/MinikubeKubectlCmd (0.14s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:731: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 kubectl -- --context functional-125117 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.14s)

TestFunctional/serial/MinikubeKubectlCmdDirectly (0.13s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:756: (dbg) Run:  out/kubectl --context functional-125117 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (0.13s)

TestFunctional/serial/ExtraConfig (296.24s)

=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:772: (dbg) Run:  out/minikube-linux-arm64 start -p functional-125117 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E1219 05:51:49.410798 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:53:11.331094 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:55:27.492149 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 05:55:55.172902 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:772: (dbg) Done: out/minikube-linux-arm64 start -p functional-125117 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (4m56.242529254s)
functional_test.go:776: restart took 4m56.24264407s for "functional-125117" cluster.
I1219 05:56:19.209038 2000386 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
--- PASS: TestFunctional/serial/ExtraConfig (296.24s)
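The `--extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision` flag restarted with above follows a `component.key=value` shape (the profile dump later logs it as `{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}`). A minimal sketch of parsing that shape; type and function names here are illustrative, not minikube's actual code:

```go
package main

import (
	"fmt"
	"strings"
)

// extraOption mirrors the component/key/value triple that the log shows
// for --extra-config entries. Illustrative names, not minikube's types.
type extraOption struct {
	Component, Key, Value string
}

// parseExtraOption splits "component.key=value" into its three parts.
func parseExtraOption(s string) (extraOption, error) {
	name, value, ok := strings.Cut(s, "=")
	if !ok {
		return extraOption{}, fmt.Errorf("missing '=' in %q", s)
	}
	component, key, ok := strings.Cut(name, ".")
	if !ok {
		return extraOption{}, fmt.Errorf("missing '.' in %q", name)
	}
	return extraOption{Component: component, Key: key, Value: value}, nil
}

func main() {
	opt, err := parseExtraOption("apiserver.enable-admission-plugins=NamespaceAutoProvision")
	if err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", opt)
}
```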

TestFunctional/serial/ComponentHealth (0.13s)

=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:825: (dbg) Run:  kubectl --context functional-125117 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:840: etcd phase: Running
functional_test.go:850: etcd status: Ready
functional_test.go:840: kube-apiserver phase: Running
functional_test.go:850: kube-apiserver status: Ready
functional_test.go:840: kube-controller-manager phase: Running
functional_test.go:850: kube-controller-manager status: Ready
functional_test.go:840: kube-scheduler phase: Running
functional_test.go:850: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.13s)

TestFunctional/serial/LogsCmd (1.63s)

=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1251: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 logs
functional_test.go:1251: (dbg) Done: out/minikube-linux-arm64 -p functional-125117 logs: (1.628145522s)
--- PASS: TestFunctional/serial/LogsCmd (1.63s)

TestFunctional/serial/LogsFileCmd (1.66s)

=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1265: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 logs --file /tmp/TestFunctionalserialLogsFileCmd1890002977/001/logs.txt
functional_test.go:1265: (dbg) Done: out/minikube-linux-arm64 -p functional-125117 logs --file /tmp/TestFunctionalserialLogsFileCmd1890002977/001/logs.txt: (1.661588289s)
--- PASS: TestFunctional/serial/LogsFileCmd (1.66s)

TestFunctional/serial/InvalidService (4.84s)

=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2326: (dbg) Run:  kubectl --context functional-125117 apply -f testdata/invalidsvc.yaml
functional_test.go:2340: (dbg) Run:  out/minikube-linux-arm64 service invalid-svc -p functional-125117
functional_test.go:2340: (dbg) Non-zero exit: out/minikube-linux-arm64 service invalid-svc -p functional-125117: exit status 115 (426.942498ms)

-- stdout --
	┌───────────┬─────────────┬─────────────┬───────────────────────────┐
	│ NAMESPACE │    NAME     │ TARGET PORT │            URL            │
	├───────────┼─────────────┼─────────────┼───────────────────────────┤
	│ default   │ invalid-svc │ 80          │ http://192.168.49.2:32103 │
	└───────────┴─────────────┴─────────────┴───────────────────────────┘
	
	

-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log                 │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
functional_test.go:2332: (dbg) Run:  kubectl --context functional-125117 delete -f testdata/invalidsvc.yaml
functional_test.go:2332: (dbg) Done: kubectl --context functional-125117 delete -f testdata/invalidsvc.yaml: (1.166304882s)
--- PASS: TestFunctional/serial/InvalidService (4.84s)

TestFunctional/parallel/ConfigCmd (0.47s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd

=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-125117 config get cpus: exit status 14 (78.103607ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 config set cpus 2
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 config get cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-125117 config get cpus: exit status 14 (84.892906ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.47s)

TestFunctional/parallel/DryRun (0.59s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun

=== CONT  TestFunctional/parallel/DryRun
functional_test.go:989: (dbg) Run:  out/minikube-linux-arm64 start -p functional-125117 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd
functional_test.go:989: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-125117 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd: exit status 23 (239.854983ms)

-- stdout --
	* [functional-125117] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22230
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I1219 05:57:00.557830 2047632 out.go:360] Setting OutFile to fd 1 ...
	I1219 05:57:00.557986 2047632 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:57:00.557998 2047632 out.go:374] Setting ErrFile to fd 2...
	I1219 05:57:00.558004 2047632 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:57:00.558284 2047632 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 05:57:00.558667 2047632 out.go:368] Setting JSON to false
	I1219 05:57:00.559708 2047632 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":38367,"bootTime":1766085454,"procs":205,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 05:57:00.559811 2047632 start.go:143] virtualization:  
	I1219 05:57:00.568387 2047632 out.go:179] * [functional-125117] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 05:57:00.570631 2047632 notify.go:221] Checking for updates...
	I1219 05:57:00.571262 2047632 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 05:57:00.574170 2047632 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 05:57:00.577049 2047632 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 05:57:00.579883 2047632 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 05:57:00.582966 2047632 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 05:57:00.585893 2047632 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 05:57:00.589272 2047632 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1219 05:57:00.589910 2047632 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 05:57:00.625887 2047632 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 05:57:00.626016 2047632 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 05:57:00.703132 2047632 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 05:57:00.693315216 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 05:57:00.703226 2047632 docker.go:319] overlay module found
	I1219 05:57:00.706369 2047632 out.go:179] * Using the docker driver based on existing profile
	I1219 05:57:00.709245 2047632 start.go:309] selected driver: docker
	I1219 05:57:00.709265 2047632 start.go:928] validating driver "docker" against &{Name:functional-125117 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-125117 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 05:57:00.709362 2047632 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 05:57:00.712868 2047632 out.go:203] 
	W1219 05:57:00.716263 2047632 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1219 05:57:00.719147 2047632 out.go:203] 

** /stderr **
functional_test.go:1006: (dbg) Run:  out/minikube-linux-arm64 start -p functional-125117 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
--- PASS: TestFunctional/parallel/DryRun (0.59s)
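The dry-run exits with status 23 (`RSRC_INSUFFICIENT_REQ_MEMORY`) because the requested 250MiB is below the 1800MB usable minimum quoted in the error. A minimal sketch of that validation; the constant and function names are assumptions, not minikube's code:

```go
package main

import "fmt"

// minUsableMB mirrors the 1800MB floor cited in the
// RSRC_INSUFFICIENT_REQ_MEMORY message above.
const minUsableMB = 1800

// validateRequestedMemory rejects allocations below the usable minimum —
// the check that fails `--dry-run --memory 250MB` in the log.
func validateRequestedMemory(requestedMB int) error {
	if requestedMB < minUsableMB {
		return fmt.Errorf("requested memory allocation %dMiB is less than the usable minimum of %dMB",
			requestedMB, minUsableMB)
	}
	return nil
}

func main() {
	fmt.Println(validateRequestedMemory(250))
	fmt.Println(validateRequestedMemory(4096))
}
```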

TestFunctional/parallel/InternationalLanguage (0.5s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage

=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1035: (dbg) Run:  out/minikube-linux-arm64 start -p functional-125117 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd
functional_test.go:1035: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-125117 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd: exit status 23 (500.461751ms)

-- stdout --
	* [functional-125117] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22230
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote docker basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I1219 05:57:00.227246 2047543 out.go:360] Setting OutFile to fd 1 ...
	I1219 05:57:00.227507 2047543 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:57:00.227535 2047543 out.go:374] Setting ErrFile to fd 2...
	I1219 05:57:00.227559 2047543 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 05:57:00.228201 2047543 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 05:57:00.228994 2047543 out.go:368] Setting JSON to false
	I1219 05:57:00.230499 2047543 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":38367,"bootTime":1766085454,"procs":207,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 05:57:00.230666 2047543 start.go:143] virtualization:  
	I1219 05:57:00.234902 2047543 out.go:179] * [functional-125117] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	I1219 05:57:00.239101 2047543 notify.go:221] Checking for updates...
	I1219 05:57:00.255591 2047543 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 05:57:00.263068 2047543 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 05:57:00.271894 2047543 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 05:57:00.275157 2047543 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 05:57:00.278384 2047543 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 05:57:00.293782 2047543 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 05:57:00.297731 2047543 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1219 05:57:00.298574 2047543 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 05:57:00.376389 2047543 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 05:57:00.376555 2047543 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 05:57:00.463026 2047543 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 05:57:00.452631334 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 05:57:00.463132 2047543 docker.go:319] overlay module found
	I1219 05:57:00.466256 2047543 out.go:179] * Utilisation du pilote docker basé sur le profil existant
	I1219 05:57:00.469140 2047543 start.go:309] selected driver: docker
	I1219 05:57:00.469162 2047543 start.go:928] validating driver "docker" against &{Name:functional-125117 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.3 ClusterName:functional-125117 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.3 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 05:57:00.469274 2047543 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 05:57:00.472823 2047543 out.go:203] 
	W1219 05:57:00.475719 2047543 out.go:285] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I1219 05:57:00.478576 2047543 out.go:203] 

** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.50s)

TestFunctional/parallel/StatusCmd (1.33s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd

=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:869: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 status
functional_test.go:875: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:887: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (1.33s)
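The `-f` flag above renders minikube's status through a Go `text/template`. A self-contained sketch using the exact template string from the log; the `status` struct here is a stand-in for minikube's status type, and the "kublet" label is verbatim from the test command:

```go
package main

import (
	"fmt"
	"strings"
	"text/template"
)

// status holds the fields the -f template references; a stand-in type.
type status struct {
	Host, Kubelet, APIServer, Kubeconfig string
}

// render applies the same template string passed via `status -f` in the
// log above.
func render(s status) string {
	const format = "host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}"
	var b strings.Builder
	// template.Must panics on a parse error, which is fine for a constant.
	if err := template.Must(template.New("status").Parse(format)).Execute(&b, s); err != nil {
		panic(err)
	}
	return b.String()
}

func main() {
	fmt.Println(render(status{Host: "Running", Kubelet: "Running", APIServer: "Running", Kubeconfig: "Configured"}))
}
```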

TestFunctional/parallel/ServiceCmdConnect (9.61s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1636: (dbg) Run:  kubectl --context functional-125117 create deployment hello-node-connect --image kicbase/echo-server
functional_test.go:1640: (dbg) Run:  kubectl --context functional-125117 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1645: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:353: "hello-node-connect-7d85dfc575-fknbj" [bca27054-3fd4-4908-97e5-712de5277783] Pending / Ready:ContainersNotReady (containers with unready status: [echo-server]) / ContainersReady:ContainersNotReady (containers with unready status: [echo-server])
helpers_test.go:353: "hello-node-connect-7d85dfc575-fknbj" [bca27054-3fd4-4908-97e5-712de5277783] Running
functional_test.go:1645: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 9.004156484s
functional_test.go:1654: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 service hello-node-connect --url
functional_test.go:1660: found endpoint for hello-node-connect: http://192.168.49.2:31625
functional_test.go:1680: http://192.168.49.2:31625: success! body:
Request served by hello-node-connect-7d85dfc575-fknbj

HTTP/1.1 GET /

Host: 192.168.49.2:31625
Accept-Encoding: gzip
User-Agent: Go-http-client/1.1
--- PASS: TestFunctional/parallel/ServiceCmdConnect (9.61s)
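The endpoint `service hello-node-connect --url` printed above is just the node IP joined with the service's NodePort. A minimal sketch of building that URL; the helper name is illustrative:

```go
package main

import (
	"fmt"
	"net"
)

// nodePortURL builds the kind of endpoint the `service --url` command
// printed in the log, from a node IP and a NodePort.
func nodePortURL(nodeIP string, nodePort int) string {
	// net.JoinHostPort also brackets IPv6 addresses correctly.
	return "http://" + net.JoinHostPort(nodeIP, fmt.Sprint(nodePort))
}

func main() {
	fmt.Println(nodePortURL("192.168.49.2", 31625))
}
```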

TestFunctional/parallel/AddonsCmd (0.14s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd

=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1695: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 addons list
functional_test.go:1707: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.14s)

TestFunctional/parallel/PersistentVolumeClaim (18.86s)

=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:50: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:353: "storage-provisioner" [cd167973-9e2c-4a88-9ab5-27714f7c4b79] Running
functional_test_pvc_test.go:50: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 5.058415197s
functional_test_pvc_test.go:55: (dbg) Run:  kubectl --context functional-125117 get storageclass -o=json
functional_test_pvc_test.go:75: (dbg) Run:  kubectl --context functional-125117 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:82: (dbg) Run:  kubectl --context functional-125117 get pvc myclaim -o=json
functional_test_pvc_test.go:131: (dbg) Run:  kubectl --context functional-125117 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:353: "sp-pod" [fb529849-5b3d-477d-af03-a9e5450c0886] Pending
helpers_test.go:353: "sp-pod" [fb529849-5b3d-477d-af03-a9e5450c0886] Running
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 6.004076897s
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-125117 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:112: (dbg) Run:  kubectl --context functional-125117 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:131: (dbg) Run:  kubectl --context functional-125117 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:353: "sp-pod" [da015a02-39b7-48a3-918d-e0a9d59c7049] Pending
helpers_test.go:353: "sp-pod" [da015a02-39b7-48a3-918d-e0a9d59c7049] Running
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 6.005598748s
functional_test_pvc_test.go:120: (dbg) Run:  kubectl --context functional-125117 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (18.86s)

TestFunctional/parallel/SSHCmd (0.72s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd
=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1730: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "echo hello"
functional_test.go:1747: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.72s)

TestFunctional/parallel/CpCmd (2.51s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd
=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh -n functional-125117 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 cp functional-125117:/home/docker/cp-test.txt /tmp/TestFunctionalparallelCpCmd239636102/001/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh -n functional-125117 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh -n functional-125117 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (2.51s)

TestFunctional/parallel/FileSync (0.39s)

=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync
=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1934: Checking for existence of /etc/test/nested/copy/2000386/hosts within VM
functional_test.go:1936: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "sudo cat /etc/test/nested/copy/2000386/hosts"
functional_test.go:1941: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.39s)

TestFunctional/parallel/CertSync (2.2s)

=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync
=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1977: Checking for existence of /etc/ssl/certs/2000386.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "sudo cat /etc/ssl/certs/2000386.pem"
functional_test.go:1977: Checking for existence of /usr/share/ca-certificates/2000386.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "sudo cat /usr/share/ca-certificates/2000386.pem"
functional_test.go:1977: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/20003862.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "sudo cat /etc/ssl/certs/20003862.pem"
functional_test.go:2004: Checking for existence of /usr/share/ca-certificates/20003862.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "sudo cat /usr/share/ca-certificates/20003862.pem"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (2.20s)

TestFunctional/parallel/NodeLabels (0.09s)

=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels
=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:234: (dbg) Run:  kubectl --context functional-125117 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.09s)

TestFunctional/parallel/NonActiveRuntimeDisabled (0.72s)

=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled
=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "sudo systemctl is-active docker"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-125117 ssh "sudo systemctl is-active docker": exit status 1 (366.20541ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "sudo systemctl is-active crio"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-125117 ssh "sudo systemctl is-active crio": exit status 1 (356.372033ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.72s)

TestFunctional/parallel/License (0.32s)

=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License
=== CONT  TestFunctional/parallel/License
functional_test.go:2293: (dbg) Run:  out/minikube-linux-arm64 license
--- PASS: TestFunctional/parallel/License (0.32s)

TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.74s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-125117 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-125117 tunnel --alsologtostderr]
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-125117 tunnel --alsologtostderr] ...
helpers_test.go:526: unable to kill pid 2044639: os: process already finished
helpers_test.go:526: unable to kill pid 2044445: os: process already finished
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-125117 tunnel --alsologtostderr] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.74s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-linux-arm64 -p functional-125117 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.00s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (8.54s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-125117 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:353: "nginx-svc" [565e3a67-991d-4e71-8a9c-664463902faf] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:353: "nginx-svc" [565e3a67-991d-4e71-8a9c-664463902faf] Running
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 8.003690574s
I1219 05:56:37.946690 2000386 kapi.go:150] Service nginx-svc in namespace default found.
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (8.54s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.08s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:234: (dbg) Run:  kubectl --context functional-125117 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.08s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:299: tunnel at http://10.107.195.190 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.11s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-linux-arm64 -p functional-125117 tunnel --alsologtostderr] ...
functional_test_tunnel_test.go:437: failed to stop process: signal: terminated
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.11s)

TestFunctional/parallel/ServiceCmd/DeployApp (8.23s)

=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1451: (dbg) Run:  kubectl --context functional-125117 create deployment hello-node --image kicbase/echo-server
functional_test.go:1455: (dbg) Run:  kubectl --context functional-125117 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1460: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:353: "hello-node-75c85bcc94-b9fdv" [4db96bce-27a1-43b6-9d02-538c065c2af0] Pending / Ready:ContainersNotReady (containers with unready status: [echo-server]) / ContainersReady:ContainersNotReady (containers with unready status: [echo-server])
helpers_test.go:353: "hello-node-75c85bcc94-b9fdv" [4db96bce-27a1-43b6-9d02-538c065c2af0] Running
functional_test.go:1460: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 8.003415946s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (8.23s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.45s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1285: (dbg) Run:  out/minikube-linux-arm64 profile lis
functional_test.go:1290: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.45s)

TestFunctional/parallel/ProfileCmd/profile_list (0.48s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1325: (dbg) Run:  out/minikube-linux-arm64 profile list
functional_test.go:1330: Took "417.905523ms" to run "out/minikube-linux-arm64 profile list"
functional_test.go:1339: (dbg) Run:  out/minikube-linux-arm64 profile list -l
functional_test.go:1344: Took "58.363606ms" to run "out/minikube-linux-arm64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.48s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.44s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1376: (dbg) Run:  out/minikube-linux-arm64 profile list -o json
functional_test.go:1381: Took "379.375842ms" to run "out/minikube-linux-arm64 profile list -o json"
functional_test.go:1389: (dbg) Run:  out/minikube-linux-arm64 profile list -o json --light
functional_test.go:1394: Took "59.287883ms" to run "out/minikube-linux-arm64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.44s)

TestFunctional/parallel/MountCmd/any-port (8.47s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:74: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-125117 /tmp/TestFunctionalparallelMountCmdany-port1771311267/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:108: wrote "test-1766123810219899224" to /tmp/TestFunctionalparallelMountCmdany-port1771311267/001/created-by-test
functional_test_mount_test.go:108: wrote "test-1766123810219899224" to /tmp/TestFunctionalparallelMountCmdany-port1771311267/001/created-by-test-removed-by-pod
functional_test_mount_test.go:108: wrote "test-1766123810219899224" to /tmp/TestFunctionalparallelMountCmdany-port1771311267/001/test-1766123810219899224
functional_test_mount_test.go:116: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:116: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-125117 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (354.053513ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1219 05:56:50.575720 2000386 retry.go:31] will retry after 669.445788ms: exit status 1
functional_test_mount_test.go:116: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:130: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh -- ls -la /mount-9p
functional_test_mount_test.go:134: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Dec 19 05:56 created-by-test
-rw-r--r-- 1 docker docker 24 Dec 19 05:56 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Dec 19 05:56 test-1766123810219899224
functional_test_mount_test.go:138: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh cat /mount-9p/test-1766123810219899224
functional_test_mount_test.go:149: (dbg) Run:  kubectl --context functional-125117 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:154: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:353: "busybox-mount" [5e332dfd-c563-435a-986f-72d2b2f5c3ed] Pending
helpers_test.go:353: "busybox-mount" [5e332dfd-c563-435a-986f-72d2b2f5c3ed] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:353: "busybox-mount" [5e332dfd-c563-435a-986f-72d2b2f5c3ed] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:353: "busybox-mount" [5e332dfd-c563-435a-986f-72d2b2f5c3ed] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:154: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 5.004112587s
functional_test_mount_test.go:170: (dbg) Run:  kubectl --context functional-125117 logs busybox-mount
functional_test_mount_test.go:182: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:182: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:91: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:95: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-125117 /tmp/TestFunctionalparallelMountCmdany-port1771311267/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (8.47s)

TestFunctional/parallel/ServiceCmd/List (0.56s)

=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1469: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.56s)

TestFunctional/parallel/ServiceCmd/JSONOutput (0.52s)

=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1499: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 service list -o json
functional_test.go:1504: Took "518.406944ms" to run "out/minikube-linux-arm64 -p functional-125117 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.52s)

TestFunctional/parallel/ServiceCmd/HTTPS (0.51s)

=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1519: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 service --namespace=default --https --url hello-node
functional_test.go:1532: found endpoint: https://192.168.49.2:32677
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.51s)

TestFunctional/parallel/ServiceCmd/Format (0.46s)

=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1550: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.46s)

TestFunctional/parallel/ServiceCmd/URL (0.5s)

=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1569: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 service hello-node --url
functional_test.go:1575: found endpoint for hello-node: http://192.168.49.2:32677
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.50s)

TestFunctional/parallel/MountCmd/specific-port (2.46s)

=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:219: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-125117 /tmp/TestFunctionalparallelMountCmdspecific-port570993593/001:/mount-9p --alsologtostderr -v=1 --port 35193]
functional_test_mount_test.go:249: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:249: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-125117 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (510.702196ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1219 05:56:59.197555 2000386 retry.go:31] will retry after 306.626837ms: exit status 1
functional_test_mount_test.go:249: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:263: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh -- ls -la /mount-9p
functional_test_mount_test.go:267: guest mount directory contents
total 0
functional_test_mount_test.go:269: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-125117 /tmp/TestFunctionalparallelMountCmdspecific-port570993593/001:/mount-9p --alsologtostderr -v=1 --port 35193] ...
functional_test_mount_test.go:270: reading mount text
functional_test_mount_test.go:284: done reading mount text
functional_test_mount_test.go:236: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:236: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-125117 ssh "sudo umount -f /mount-9p": exit status 1 (415.585651ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:238: "out/minikube-linux-arm64 -p functional-125117 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:240: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-125117 /tmp/TestFunctionalparallelMountCmdspecific-port570993593/001:/mount-9p --alsologtostderr -v=1 --port 35193] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (2.46s)

TestFunctional/parallel/MountCmd/VerifyCleanup (1.59s)

=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:304: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-125117 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1967260226/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:304: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-125117 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1967260226/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:304: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-125117 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1967260226/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:331: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "findmnt -T" /mount1
functional_test_mount_test.go:331: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "findmnt -T" /mount2
functional_test_mount_test.go:331: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh "findmnt -T" /mount3
functional_test_mount_test.go:376: (dbg) Run:  out/minikube-linux-arm64 mount -p functional-125117 --kill=true
functional_test_mount_test.go:319: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-125117 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1967260226/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:319: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-125117 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1967260226/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:319: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-125117 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1967260226/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (1.59s)

TestFunctional/parallel/Version/short (0.07s)

=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short
=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2261: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 version --short
--- PASS: TestFunctional/parallel/Version/short (0.07s)

TestFunctional/parallel/Version/components (1.41s)

=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components
=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2275: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 version -o=json --components
functional_test.go:2275: (dbg) Done: out/minikube-linux-arm64 -p functional-125117 version -o=json --components: (1.406617098s)
--- PASS: TestFunctional/parallel/Version/components (1.41s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.26s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort
=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image ls --format short --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-125117 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.10.1
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.34.3
registry.k8s.io/kube-proxy:v1.34.3
registry.k8s.io/kube-controller-manager:v1.34.3
registry.k8s.io/kube-apiserver:v1.34.3
registry.k8s.io/etcd:3.6.5-0
registry.k8s.io/coredns/coredns:v1.12.1
public.ecr.aws/nginx/nginx:alpine
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
docker.io/library/minikube-local-cache-test:functional-125117
docker.io/library/kong:3.9
docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88
docker.io/kindest/kindnetd:v20250512-df8de77b
docker.io/kicbase/echo-server:functional-125117
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-125117 image ls --format short --alsologtostderr:
I1219 05:57:15.414927 2050204 out.go:360] Setting OutFile to fd 1 ...
I1219 05:57:15.415106 2050204 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 05:57:15.415127 2050204 out.go:374] Setting ErrFile to fd 2...
I1219 05:57:15.415144 2050204 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 05:57:15.415397 2050204 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 05:57:15.416017 2050204 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1219 05:57:15.416168 2050204 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1219 05:57:15.416709 2050204 cli_runner.go:164] Run: docker container inspect functional-125117 --format={{.State.Status}}
I1219 05:57:15.435505 2050204 ssh_runner.go:195] Run: systemctl --version
I1219 05:57:15.435565 2050204 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-125117
I1219 05:57:15.456091 2050204 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34699 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-125117/id_rsa Username:docker}
I1219 05:57:15.572013 2050204 ssh_runner.go:195] Run: sudo crictl --timeout=10s images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.26s)

TestFunctional/parallel/ImageCommands/ImageListTable (0.29s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable

=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image ls --format table --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-125117 image ls --format table --alsologtostderr:
┌─────────────────────────────────────────────┬───────────────────────────────────────┬───────────────┬────────┐
│                    IMAGE                    │                  TAG                  │   IMAGE ID    │  SIZE  │
├─────────────────────────────────────────────┼───────────────────────────────────────┼───────────────┼────────┤
│ docker.io/library/minikube-local-cache-test │ functional-125117                     │ sha256:51d149 │ 992B   │
│ gcr.io/k8s-minikube/storage-provisioner     │ v5                                    │ sha256:ba04bb │ 8.03MB │
│ localhost/my-image                          │ functional-125117                     │ sha256:0017f5 │ 831kB  │
│ public.ecr.aws/nginx/nginx                  │ alpine                                │ sha256:962dbb │ 23MB   │
│ registry.k8s.io/pause                       │ 3.1                                   │ sha256:8057e0 │ 262kB  │
│ registry.k8s.io/pause                       │ latest                                │ sha256:8cb209 │ 71.3kB │
│ docker.io/kicbase/echo-server               │ functional-125117                     │ sha256:ce2d2c │ 2.17MB │
│ registry.k8s.io/kube-apiserver              │ v1.34.3                               │ sha256:cf65ae │ 24.6MB │
│ registry.k8s.io/kube-controller-manager     │ v1.34.3                               │ sha256:7ada8f │ 20.7MB │
│ registry.k8s.io/pause                       │ 3.10.1                                │ sha256:d7b100 │ 268kB  │
│ docker.io/kindest/kindnetd                  │ v20251212-v0.29.0-alpha-105-g20ccfc88 │ sha256:c96ee3 │ 38.5MB │
│ docker.io/kubernetesui/dashboard-api        │ 1.14.0                                │ sha256:85ac4c │ 15MB   │
│ docker.io/library/kong                      │ 3.9                                   │ sha256:2bf86f │ 119MB  │
│ gcr.io/k8s-minikube/busybox                 │ 1.28.4-glibc                          │ sha256:1611cd │ 1.94MB │
│ registry.k8s.io/coredns/coredns             │ v1.12.1                               │ sha256:138784 │ 20.4MB │
│ registry.k8s.io/etcd                        │ 3.6.5-0                               │ sha256:2c5f0d │ 21.1MB │
│ registry.k8s.io/kube-proxy                  │ v1.34.3                               │ sha256:4461da │ 22.8MB │
│ registry.k8s.io/pause                       │ 3.3                                   │ sha256:3d1873 │ 249kB  │
│ docker.io/kindest/kindnetd                  │ v20250512-df8de77b                    │ sha256:b1a8c6 │ 40.6MB │
│ docker.io/kubernetesui/dashboard-web        │ 1.7.0                                 │ sha256:2c51e8 │ 61.1MB │
│ registry.k8s.io/kube-scheduler              │ v1.34.3                               │ sha256:2f2aa2 │ 15.8MB │
└─────────────────────────────────────────────┴───────────────────────────────────────┴───────────────┴────────┘
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-125117 image ls --format table --alsologtostderr:
I1219 05:57:20.932375 2052058 out.go:360] Setting OutFile to fd 1 ...
I1219 05:57:20.932467 2052058 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 05:57:20.932472 2052058 out.go:374] Setting ErrFile to fd 2...
I1219 05:57:20.932477 2052058 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 05:57:20.932726 2052058 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 05:57:20.934111 2052058 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1219 05:57:20.934264 2052058 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1219 05:57:20.934990 2052058 cli_runner.go:164] Run: docker container inspect functional-125117 --format={{.State.Status}}
I1219 05:57:20.966587 2052058 ssh_runner.go:195] Run: systemctl --version
I1219 05:57:20.966646 2052058 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-125117
I1219 05:57:20.987207 2052058 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34699 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-125117/id_rsa Username:docker}
I1219 05:57:21.095730 2052058 ssh_runner.go:195] Run: sudo crictl --timeout=10s images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.29s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.29s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson

=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image ls --format json --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-125117 image ls --format json --alsologtostderr:
[{"id":"sha256:1611cd07b61d57dbbfebe6db242513fd51e1c02d20ba08af17a45837d86a8a8c","repoDigests":["gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e"],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"1935750"},{"id":"sha256:962dbbc0e55ec93371166cf3e1f723875ce281259bb90b8092248398555aff67","repoDigests":["public.ecr.aws/nginx/nginx@sha256:a411c634df4374901a4a9370626801998f159652f627b1cdfbbbe012adcd6c76"],"repoTags":["public.ecr.aws/nginx/nginx:alpine"],"size":"22987510"},{"id":"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"262191"},{"id":"sha256:c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13","repoDigests":["docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae"],"repoTags":["docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88"],"size":"38502448"},{"id":"sha256:51d14939a1995
a88415fccb269ec40dc043aefbcf5035f79ba02097bb3909863","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-125117"],"size":"992"},{"id":"sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6","repoDigests":["gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"8034419"},{"id":"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896","repoDigests":["registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460"],"repoTags":["registry.k8s.io/kube-apiserver:v1.34.3"],"size":"24567639"},{"id":"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22","repoDigests":["registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954"],"repoTags":["registry.k8s.io/kube-controller-manager:v1.34.3"],"size":"20719958"},{"id":"sha2
56:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6","repoDigests":["registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2"],"repoTags":["registry.k8s.io/kube-scheduler:v1.34.3"],"size":"15776215"},{"id":"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17","repoDigests":[],"repoTags":["docker.io/kicbase/echo-server:functional-125117"],"size":"2173567"},{"id":"sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c","repoDigests":["docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"],"repoTags":["docker.io/kindest/kindnetd:v20250512-df8de77b"],"size":"40636774"},{"id":"sha256:2bf86f243d2501de569559bf4860b3f80303583722d53ea1faf49066051de286","repoDigests":["docker.io/library/kong@sha256:4379444ecfd82794b27de38a74ba540e8571683dfdfce74c8ecb4018f308fb29"],"repoTags":["docker.io/library/kong:3.9"],"size":"118876318"},{"id":"sha256:0017f5bb11fe5eeb1e19f8331b75e2de5fc75
a042d3f7935e09cddf1c6817f9d","repoDigests":[],"repoTags":["localhost/my-image:functional-125117"],"size":"830602"},{"id":"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc","repoDigests":["registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"],"repoTags":["registry.k8s.io/coredns/coredns:v1.12.1"],"size":"20392204"},{"id":"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42","repoDigests":["registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534"],"repoTags":["registry.k8s.io/etcd:3.6.5-0"],"size":"21136588"},{"id":"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162","repoDigests":["registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6"],"repoTags":["registry.k8s.io/kube-proxy:v1.34.3"],"size":"22804272"},{"id":"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a","repoDigests":[],"repoTags":["reg
istry.k8s.io/pause:latest"],"size":"71300"},{"id":"sha256:85ac4c11285e7563c0b7c0c33244414daac39bcbb147ac42a90c323d489a09d2","repoDigests":["docker.io/kubernetesui/dashboard-api@sha256:96a702cfd3399d9eba23b3d37b09f798a4f51fcd8c8dfa8552c7829ade9c4aff"],"repoTags":["docker.io/kubernetesui/dashboard-api:1.14.0"],"size":"14986407"},{"id":"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd","repoDigests":["registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"],"repoTags":["registry.k8s.io/pause:3.10.1"],"size":"267939"},{"id":"sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"249461"}]
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-125117 image ls --format json --alsologtostderr:
I1219 05:57:20.626550 2052016 out.go:360] Setting OutFile to fd 1 ...
I1219 05:57:20.626763 2052016 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 05:57:20.626790 2052016 out.go:374] Setting ErrFile to fd 2...
I1219 05:57:20.626811 2052016 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 05:57:20.627089 2052016 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 05:57:20.627726 2052016 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1219 05:57:20.627895 2052016 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1219 05:57:20.628445 2052016 cli_runner.go:164] Run: docker container inspect functional-125117 --format={{.State.Status}}
I1219 05:57:20.662372 2052016 ssh_runner.go:195] Run: systemctl --version
I1219 05:57:20.662423 2052016 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-125117
I1219 05:57:20.700826 2052016 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34699 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-125117/id_rsa Username:docker}
I1219 05:57:20.807975 2052016 ssh_runner.go:195] Run: sudo crictl --timeout=10s images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.29s)

TestFunctional/parallel/ImageCommands/ImageListYaml (0.31s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml

=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image ls --format yaml --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-125117 image ls --format yaml --alsologtostderr:
- id: sha256:1611cd07b61d57dbbfebe6db242513fd51e1c02d20ba08af17a45837d86a8a8c
repoDigests:
- gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "1935750"
- id: sha256:962dbbc0e55ec93371166cf3e1f723875ce281259bb90b8092248398555aff67
repoDigests:
- public.ecr.aws/nginx/nginx@sha256:a411c634df4374901a4a9370626801998f159652f627b1cdfbbbe012adcd6c76
repoTags:
- public.ecr.aws/nginx/nginx:alpine
size: "22987510"
- id: sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6
repoDigests:
- registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2
repoTags:
- registry.k8s.io/kube-scheduler:v1.34.3
size: "15776215"
- id: sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6
repoDigests:
- gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "8034419"
- id: sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22
repoDigests:
- registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954
repoTags:
- registry.k8s.io/kube-controller-manager:v1.34.3
size: "20719958"
- id: sha256:51d14939a1995a88415fccb269ec40dc043aefbcf5035f79ba02097bb3909863
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-125117
size: "992"
- id: sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc
repoDigests:
- registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c
repoTags:
- registry.k8s.io/coredns/coredns:v1.12.1
size: "20392204"
- id: sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162
repoDigests:
- registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6
repoTags:
- registry.k8s.io/kube-proxy:v1.34.3
size: "22804272"
- id: sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "262191"
- id: sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd
repoDigests:
- registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c
repoTags:
- registry.k8s.io/pause:3.10.1
size: "267939"
- id: sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "249461"
- id: sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17
repoDigests: []
repoTags:
- docker.io/kicbase/echo-server:functional-125117
size: "2173567"
- id: sha256:2bf86f243d2501de569559bf4860b3f80303583722d53ea1faf49066051de286
repoDigests:
- docker.io/library/kong@sha256:4379444ecfd82794b27de38a74ba540e8571683dfdfce74c8ecb4018f308fb29
repoTags:
- docker.io/library/kong:3.9
size: "118876318"
- id: sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42
repoDigests:
- registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534
repoTags:
- registry.k8s.io/etcd:3.6.5-0
size: "21136588"
- id: sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896
repoDigests:
- registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460
repoTags:
- registry.k8s.io/kube-apiserver:v1.34.3
size: "24567639"
- id: sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "71300"
- id: sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c
repoDigests:
- docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a
repoTags:
- docker.io/kindest/kindnetd:v20250512-df8de77b
size: "40636774"
- id: sha256:c96ee3c17498748ccc544ba99ee8ffeb020fc335b230b43cd28bf43bed229a13
repoDigests:
- docker.io/kindest/kindnetd@sha256:377e2e7a513148f7c942b51cd57bdce1589940df856105384ac7f753a1ab43ae
repoTags:
- docker.io/kindest/kindnetd:v20251212-v0.29.0-alpha-105-g20ccfc88
size: "38502448"

functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-125117 image ls --format yaml --alsologtostderr:
I1219 05:57:15.677255 2050241 out.go:360] Setting OutFile to fd 1 ...
I1219 05:57:15.677434 2050241 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 05:57:15.677446 2050241 out.go:374] Setting ErrFile to fd 2...
I1219 05:57:15.677452 2050241 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 05:57:15.677822 2050241 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 05:57:15.678829 2050241 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1219 05:57:15.678948 2050241 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1219 05:57:15.679535 2050241 cli_runner.go:164] Run: docker container inspect functional-125117 --format={{.State.Status}}
I1219 05:57:15.707781 2050241 ssh_runner.go:195] Run: systemctl --version
I1219 05:57:15.707852 2050241 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-125117
I1219 05:57:15.736236 2050241 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34699 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-125117/id_rsa Username:docker}
I1219 05:57:15.852729 2050241 ssh_runner.go:195] Run: sudo crictl --timeout=10s images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.31s)

TestFunctional/parallel/ImageCommands/ImageBuild (4.63s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:323: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 ssh pgrep buildkitd
functional_test.go:323: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-125117 ssh pgrep buildkitd: exit status 1 (399.710541ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:330: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image build -t localhost/my-image:functional-125117 testdata/build --alsologtostderr
functional_test.go:330: (dbg) Done: out/minikube-linux-arm64 -p functional-125117 image build -t localhost/my-image:functional-125117 testdata/build --alsologtostderr: (3.970501943s)
functional_test.go:338: (dbg) Stderr: out/minikube-linux-arm64 -p functional-125117 image build -t localhost/my-image:functional-125117 testdata/build --alsologtostderr:
I1219 05:57:16.396519 2050348 out.go:360] Setting OutFile to fd 1 ...
I1219 05:57:16.397725 2050348 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 05:57:16.397744 2050348 out.go:374] Setting ErrFile to fd 2...
I1219 05:57:16.397751 2050348 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 05:57:16.398007 2050348 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 05:57:16.398665 2050348 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1219 05:57:16.400540 2050348 config.go:182] Loaded profile config "functional-125117": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
I1219 05:57:16.401232 2050348 cli_runner.go:164] Run: docker container inspect functional-125117 --format={{.State.Status}}
I1219 05:57:16.427702 2050348 ssh_runner.go:195] Run: systemctl --version
I1219 05:57:16.427766 2050348 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-125117
I1219 05:57:16.448877 2050348 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34699 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-125117/id_rsa Username:docker}
I1219 05:57:16.559557 2050348 build_images.go:162] Building image from path: /tmp/build.1036943723.tar
I1219 05:57:16.559637 2050348 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I1219 05:57:16.568633 2050348 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.1036943723.tar
I1219 05:57:16.574961 2050348 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.1036943723.tar: stat -c "%s %y" /var/lib/minikube/build/build.1036943723.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.1036943723.tar': No such file or directory
I1219 05:57:16.574997 2050348 ssh_runner.go:362] scp /tmp/build.1036943723.tar --> /var/lib/minikube/build/build.1036943723.tar (3072 bytes)
I1219 05:57:16.603911 2050348 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.1036943723
I1219 05:57:16.616679 2050348 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.1036943723 -xf /var/lib/minikube/build/build.1036943723.tar
I1219 05:57:16.630301 2050348 containerd.go:394] Building image: /var/lib/minikube/build/build.1036943723
I1219 05:57:16.630380 2050348 ssh_runner.go:195] Run: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.1036943723 --local dockerfile=/var/lib/minikube/build/build.1036943723 --output type=image,name=localhost/my-image:functional-125117
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.1s

#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 1.6s

#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

#4 [internal] load build context
#4 transferring context: 62B done
#4 DONE 0.1s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 0.1s done
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0B / 828.50kB 0.2s
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 828.50kB / 828.50kB 0.4s done
#5 extracting sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0.1s done
#5 DONE 0.6s

#6 [2/3] RUN true
#6 DONE 0.7s

#7 [3/3] ADD content.txt /
#7 DONE 0.1s

#8 exporting to image
#8 exporting layers
#8 exporting layers 0.2s done
#8 exporting manifest sha256:c217441761043640f92321300d5f30abb92993a8ce2108f198343a6b8718abcf
#8 exporting manifest sha256:c217441761043640f92321300d5f30abb92993a8ce2108f198343a6b8718abcf 0.0s done
#8 exporting config sha256:0017f5bb11fe5eeb1e19f8331b75e2de5fc75a042d3f7935e09cddf1c6817f9d 0.0s done
#8 naming to localhost/my-image:functional-125117 done
#8 DONE 0.2s
I1219 05:57:20.260298 2050348 ssh_runner.go:235] Completed: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.1036943723 --local dockerfile=/var/lib/minikube/build/build.1036943723 --output type=image,name=localhost/my-image:functional-125117: (3.629886773s)
I1219 05:57:20.260374 2050348 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.1036943723
I1219 05:57:20.272337 2050348 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.1036943723.tar
I1219 05:57:20.284380 2050348 build_images.go:218] Built localhost/my-image:functional-125117 from /tmp/build.1036943723.tar
I1219 05:57:20.284422 2050348 build_images.go:134] succeeded building to: functional-125117
I1219 05:57:20.284428 2050348 build_images.go:135] failed building to: 
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (4.63s)

TestFunctional/parallel/ImageCommands/Setup (0.64s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:357: (dbg) Run:  docker pull kicbase/echo-server:1.0
functional_test.go:362: (dbg) Run:  docker tag kicbase/echo-server:1.0 kicbase/echo-server:functional-125117
--- PASS: TestFunctional/parallel/ImageCommands/Setup (0.64s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.12s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:370: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image load --daemon kicbase/echo-server:functional-125117 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.12s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.25s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:380: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image load --daemon kicbase/echo-server:functional-125117 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.25s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.85s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:250: (dbg) Run:  docker pull kicbase/echo-server:latest
functional_test.go:255: (dbg) Run:  docker tag kicbase/echo-server:latest kicbase/echo-server:functional-125117
functional_test.go:260: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image load --daemon kicbase/echo-server:functional-125117 --alsologtostderr
functional_test.go:260: (dbg) Done: out/minikube-linux-arm64 -p functional-125117 image load --daemon kicbase/echo-server:functional-125117 --alsologtostderr: (1.344487879s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.85s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.57s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:395: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image save kicbase/echo-server:functional-125117 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.57s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.65s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:407: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image rm kicbase/echo-server:functional-125117 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.65s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.8s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:424: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.80s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.54s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:434: (dbg) Run:  docker rmi kicbase/echo-server:functional-125117
functional_test.go:439: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 image save --daemon kicbase/echo-server:functional-125117 --alsologtostderr
functional_test.go:447: (dbg) Run:  docker image inspect kicbase/echo-server:functional-125117
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.54s)
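
The ImageCommands tests above exercise a full image round-trip: save to a tarball, remove from the cluster, load back from the file. A minimal sketch of the same flow, assuming a running profile named functional-125117 and the out/minikube-linux-arm64 binary from this job; the guard, the MK/IMG/TAR variables, and the skip message are illustrative additions, not part of the test:

```shell
# Round-trip sketch: save -> rm -> load -> ls, mirroring ImageSaveToFile,
# ImageRemove and ImageLoadFromFile above. Skips cleanly if minikube is absent.
MK="${MK:-out/minikube-linux-arm64}"          # assumed binary path
IMG="kicbase/echo-server:functional-125117"   # image tag from the log
TAR="./echo-server-save.tar"
if [ -x "$MK" ] || command -v "$MK" >/dev/null 2>&1; then
  "$MK" -p functional-125117 image save "$IMG" "$TAR" --alsologtostderr
  "$MK" -p functional-125117 image rm "$IMG" --alsologtostderr
  "$MK" -p functional-125117 image load "$TAR" --alsologtostderr
  "$MK" -p functional-125117 image ls
else
  echo "minikube binary not found; skipping round-trip"
fi
```

The same image can also be pushed back into the host Docker daemon with `image save --daemon`, which is what ImageSaveDaemon verifies via `docker image inspect`.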

TestFunctional/parallel/UpdateContextCmd/no_changes (0.19s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.19s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.15s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.15s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.18s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-125117 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.18s)

TestFunctional/delete_echo-server_images (0.04s)

=== RUN   TestFunctional/delete_echo-server_images
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:1.0
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:functional-125117
--- PASS: TestFunctional/delete_echo-server_images (0.04s)

TestFunctional/delete_my-image_image (0.02s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:213: (dbg) Run:  docker rmi -f localhost/my-image:functional-125117
--- PASS: TestFunctional/delete_my-image_image (0.02s)

TestFunctional/delete_minikube_cached_images (0.02s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:221: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-125117
--- PASS: TestFunctional/delete_minikube_cached_images (0.02s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CopySyncFile (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CopySyncFile
functional_test.go:1860: local sync path: /home/jenkins/minikube-integration/22230-1998525/.minikube/files/etc/test/nested/copy/2000386/hosts
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CopySyncFile (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/AuditLog (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/AuditLog
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/AuditLog (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubeContext (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubeContext
functional_test.go:696: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/KubeContext (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_remote (3.35s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_remote
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 cache add registry.k8s.io/pause:3.1
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-006924 cache add registry.k8s.io/pause:3.1: (1.144651792s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 cache add registry.k8s.io/pause:3.3
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-006924 cache add registry.k8s.io/pause:3.3: (1.140159203s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 cache add registry.k8s.io/pause:latest
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-006924 cache add registry.k8s.io/pause:latest: (1.060071228s)
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_remote (3.35s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_local (1.04s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_local
functional_test.go:1092: (dbg) Run:  docker build -t minikube-local-cache-test:functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1serialCacheC4223254593/001
functional_test.go:1104: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 cache add minikube-local-cache-test:functional-006924
functional_test.go:1109: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 cache delete minikube-local-cache-test:functional-006924
functional_test.go:1098: (dbg) Run:  docker rmi minikube-local-cache-test:functional-006924
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/add_local (1.04s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/CacheDelete (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/CacheDelete
functional_test.go:1117: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/CacheDelete (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/list (0.07s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/list
functional_test.go:1125: (dbg) Run:  out/minikube-linux-arm64 cache list
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/list (0.07s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/verify_cache_inside_node (0.32s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1139: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh sudo crictl images
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/verify_cache_inside_node (0.32s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/cache_reload (1.93s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/cache_reload
functional_test.go:1162: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (321.273547ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1173: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 cache reload
functional_test.go:1178: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/cache_reload (1.93s)
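
The cache_reload steps above can be reproduced by hand: delete the cached image inside the node, confirm `crictl inspecti` fails, then restore the image with `cache reload`. A sketch under the same assumptions as the log (profile functional-006924 is running); the guard and the variables are illustrative additions:

```shell
# cache_reload sketch: rmi -> inspecti (fails) -> cache reload -> inspecti (succeeds).
MK="${MK:-out/minikube-linux-arm64}"     # assumed binary path
PROFILE="functional-006924"              # profile name from the log
PAUSE_IMG="registry.k8s.io/pause:latest"
if [ -x "$MK" ] || command -v "$MK" >/dev/null 2>&1; then
  "$MK" -p "$PROFILE" ssh sudo crictl rmi "$PAUSE_IMG"
  # While the image is absent, inspecti exits non-zero, as in the log above:
  "$MK" -p "$PROFILE" ssh sudo crictl inspecti "$PAUSE_IMG" || echo "image absent, as expected"
  "$MK" -p "$PROFILE" cache reload
  "$MK" -p "$PROFILE" ssh sudo crictl inspecti "$PAUSE_IMG"
else
  echo "minikube binary not found; skipping cache reload sketch"
fi
```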

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/delete (0.13s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/delete
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/CacheCmd/cache/delete (0.13s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsCmd (0.95s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsCmd
functional_test.go:1251: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsCmd (0.95s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsFileCmd (0.97s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsFileCmd
functional_test.go:1265: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 logs --file /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1serialLogsFi1972005640/001/logs.txt
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/serial/LogsFileCmd (0.97s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd (0.45s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 config get cpus: exit status 14 (87.738221ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 config set cpus 2
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 config get cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 config get cpus: exit status 14 (55.399563ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ConfigCmd (0.45s)
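
The ConfigCmd run above relies on a specific contract: `config get` on an unset key exits with status 14, while set/get round-trips the value. A sketch using the same commands as the log; the guard and the `$?` echo are illustrative additions:

```shell
# config sketch: get on an unset key exits 14; set/get round-trips the value.
MK="${MK:-out/minikube-linux-arm64}"   # assumed binary path
PROFILE="functional-006924"            # profile name from the log
if [ -x "$MK" ] || command -v "$MK" >/dev/null 2>&1; then
  "$MK" -p "$PROFILE" config unset cpus
  "$MK" -p "$PROFILE" config get cpus
  echo "get on unset key exited $?"    # the log above shows exit status 14
  "$MK" -p "$PROFILE" config set cpus 2
  "$MK" -p "$PROFILE" config get cpus
  "$MK" -p "$PROFILE" config unset cpus
else
  echo "minikube binary not found; skipping config sketch"
fi
```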

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun (0.46s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun
functional_test.go:989: (dbg) Run:  out/minikube-linux-arm64 start -p functional-006924 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1
functional_test.go:989: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-006924 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1: exit status 23 (192.303445ms)

-- stdout --
	* [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22230
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I1219 06:26:29.462412 2082064 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:26:29.462704 2082064 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:26:29.462720 2082064 out.go:374] Setting ErrFile to fd 2...
	I1219 06:26:29.462727 2082064 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:26:29.463019 2082064 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:26:29.463459 2082064 out.go:368] Setting JSON to false
	I1219 06:26:29.464461 2082064 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":40136,"bootTime":1766085454,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:26:29.464534 2082064 start.go:143] virtualization:  
	I1219 06:26:29.467920 2082064 out.go:179] * [functional-006924] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1219 06:26:29.471913 2082064 notify.go:221] Checking for updates...
	I1219 06:26:29.475697 2082064 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:26:29.478626 2082064 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:26:29.481567 2082064 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:26:29.484618 2082064 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:26:29.487485 2082064 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:26:29.490426 2082064 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:26:29.493893 2082064 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:26:29.494477 2082064 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:26:29.525119 2082064 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:26:29.525224 2082064 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:26:29.580878 2082064 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:26:29.571698933 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:26:29.580989 2082064 docker.go:319] overlay module found
	I1219 06:26:29.584263 2082064 out.go:179] * Using the docker driver based on existing profile
	I1219 06:26:29.587138 2082064 start.go:309] selected driver: docker
	I1219 06:26:29.587162 2082064 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:26:29.587273 2082064 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:26:29.590805 2082064 out.go:203] 
	W1219 06:26:29.593659 2082064 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1219 06:26:29.596549 2082064 out.go:203] 

** /stderr **
functional_test.go:1006: (dbg) Run:  out/minikube-linux-arm64 start -p functional-006924 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DryRun (0.46s)
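
The DryRun failure path above is deliberate: requesting 250MB is below minikube's usable minimum of 1800MB, so `start --dry-run` exits with status 23 (RSRC_INSUFFICIENT_REQ_MEMORY) without touching the cluster. A sketch of the same check, with the guard and the `$?` echo as illustrative additions:

```shell
# dry-run sketch: an under-sized --memory request is rejected during validation.
MK="${MK:-out/minikube-linux-arm64}"   # assumed binary path
PROFILE="functional-006924"            # profile name from the log
if [ -x "$MK" ] || command -v "$MK" >/dev/null 2>&1; then
  "$MK" start -p "$PROFILE" --dry-run --memory 250MB --driver=docker \
    --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1
  echo "dry-run exited $?"             # the log above shows exit status 23
else
  echo "minikube binary not found; skipping dry-run sketch"
fi
```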

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage (0.18s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage
functional_test.go:1035: (dbg) Run:  out/minikube-linux-arm64 start -p functional-006924 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1
E1219 06:26:29.406106 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:1035: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-006924 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-rc.1: exit status 23 (181.47956ms)

-- stdout --
	* [functional-006924] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22230
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote docker basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I1219 06:26:29.277438 2082017 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:26:29.277624 2082017 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:26:29.277661 2082017 out.go:374] Setting ErrFile to fd 2...
	I1219 06:26:29.277674 2082017 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:26:29.278047 2082017 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:26:29.278468 2082017 out.go:368] Setting JSON to false
	I1219 06:26:29.279390 2082017 start.go:133] hostinfo: {"hostname":"ip-172-31-30-239","uptime":40136,"bootTime":1766085454,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"92f46a7d-c249-4c12-924a-77f64874c910"}
	I1219 06:26:29.279460 2082017 start.go:143] virtualization:  
	I1219 06:26:29.282957 2082017 out.go:179] * [functional-006924] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	I1219 06:26:29.285954 2082017 out.go:179]   - MINIKUBE_LOCATION=22230
	I1219 06:26:29.286026 2082017 notify.go:221] Checking for updates...
	I1219 06:26:29.291938 2082017 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1219 06:26:29.294799 2082017 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	I1219 06:26:29.297695 2082017 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	I1219 06:26:29.300636 2082017 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1219 06:26:29.303652 2082017 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1219 06:26:29.307058 2082017 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
	I1219 06:26:29.307673 2082017 driver.go:422] Setting default libvirt URI to qemu:///system
	I1219 06:26:29.331895 2082017 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1219 06:26:29.332011 2082017 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:26:29.387522 2082017 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-19 06:26:29.377686736 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:26:29.387628 2082017 docker.go:319] overlay module found
	I1219 06:26:29.390777 2082017 out.go:179] * Utilisation du pilote docker basé sur le profil existant
	I1219 06:26:29.393557 2082017 start.go:309] selected driver: docker
	I1219 06:26:29.393578 2082017 start.go:928] validating driver "docker" against &{Name:functional-006924 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1765966054-22186@sha256:1c173489767e6632c410d2554f1a2272f032a423dd528157e201daadfe3c43f0 Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-rc.1 ClusterName:functional-006924 Namespace:default APIServerHAVIP: APISer
verName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-rc.1 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUI
D:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1219 06:26:29.393693 2082017 start.go:939] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1219 06:26:29.397113 2082017 out.go:203] 
	W1219 06:26:29.399986 2082017 out.go:285] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I1219 06:26:29.402795 2082017 out.go:203] 

** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/InternationalLanguage (0.18s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd (0.16s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd
functional_test.go:1695: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 addons list
functional_test.go:1707: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 addons list -o json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/AddonsCmd (0.16s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd (0.73s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd
functional_test.go:1730: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "echo hello"
functional_test.go:1747: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "cat /etc/hostname"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/SSHCmd (0.73s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd (2.21s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh -n functional-006924 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 cp functional-006924:/home/docker/cp-test.txt /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelCpCm1537732802/001/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh -n functional-006924 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh -n functional-006924 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CpCmd (2.21s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync (0.27s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync
functional_test.go:1934: Checking for existence of /etc/test/nested/copy/2000386/hosts within VM
functional_test.go:1936: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "sudo cat /etc/test/nested/copy/2000386/hosts"
functional_test.go:1941: file sync test content: Test file for checking file sync process
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/FileSync (0.27s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync (1.71s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync
functional_test.go:1977: Checking for existence of /etc/ssl/certs/2000386.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "sudo cat /etc/ssl/certs/2000386.pem"
functional_test.go:1977: Checking for existence of /usr/share/ca-certificates/2000386.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "sudo cat /usr/share/ca-certificates/2000386.pem"
functional_test.go:1977: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/20003862.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "sudo cat /etc/ssl/certs/20003862.pem"
functional_test.go:2004: Checking for existence of /usr/share/ca-certificates/20003862.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "sudo cat /usr/share/ca-certificates/20003862.pem"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/CertSync (1.71s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled (0.58s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "sudo systemctl is-active docker"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 ssh "sudo systemctl is-active docker": exit status 1 (300.48512ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "sudo systemctl is-active crio"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 ssh "sudo systemctl is-active crio": exit status 1 (279.92395ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/NonActiveRuntimeDisabled (0.58s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License (0.24s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License
functional_test.go:2293: (dbg) Run:  out/minikube-linux-arm64 license
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/License (0.24s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/StartTunnel (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-linux-arm64 -p functional-006924 tunnel --alsologtostderr]
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/StartTunnel (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DeleteTunnel (0.1s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-linux-arm64 -p functional-006924 tunnel --alsologtostderr] ...
functional_test_tunnel_test.go:437: failed to stop process: exit status 103
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DeleteTunnel (0.10s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_not_create (0.43s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_not_create
functional_test.go:1285: (dbg) Run:  out/minikube-linux-arm64 profile lis
functional_test.go:1290: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_not_create (0.43s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_list (0.4s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_list
functional_test.go:1325: (dbg) Run:  out/minikube-linux-arm64 profile list
functional_test.go:1330: Took "342.113411ms" to run "out/minikube-linux-arm64 profile list"
functional_test.go:1339: (dbg) Run:  out/minikube-linux-arm64 profile list -l
functional_test.go:1344: Took "54.448946ms" to run "out/minikube-linux-arm64 profile list -l"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_list (0.40s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_json_output (0.41s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_json_output
functional_test.go:1376: (dbg) Run:  out/minikube-linux-arm64 profile list -o json
functional_test.go:1381: Took "354.68842ms" to run "out/minikube-linux-arm64 profile list -o json"
functional_test.go:1389: (dbg) Run:  out/minikube-linux-arm64 profile list -o json --light
functional_test.go:1394: Took "57.472051ms" to run "out/minikube-linux-arm64 profile list -o json --light"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ProfileCmd/profile_json_output (0.41s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/specific-port (2.07s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/specific-port
functional_test_mount_test.go:219: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1569475337/001:/mount-9p --alsologtostderr -v=1 --port 33725]
functional_test_mount_test.go:249: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:249: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (351.935848ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1219 06:26:22.269281 2000386 retry.go:31] will retry after 622.223256ms: exit status 1
functional_test_mount_test.go:249: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:263: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh -- ls -la /mount-9p
functional_test_mount_test.go:267: guest mount directory contents
total 0
functional_test_mount_test.go:269: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1569475337/001:/mount-9p --alsologtostderr -v=1 --port 33725] ...
functional_test_mount_test.go:270: reading mount text
functional_test_mount_test.go:284: done reading mount text
functional_test_mount_test.go:236: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:236: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 ssh "sudo umount -f /mount-9p": exit status 1 (305.055385ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:238: "out/minikube-linux-arm64 -p functional-006924 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:240: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun1569475337/001:/mount-9p --alsologtostderr -v=1 --port 33725] ...
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/specific-port (2.07s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/VerifyCleanup (2.13s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:304: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2343003325/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:304: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2343003325/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:304: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2343003325/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:331: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "findmnt -T" /mount1
functional_test_mount_test.go:331: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 ssh "findmnt -T" /mount1: exit status 1 (604.724857ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1219 06:26:24.595654 2000386 retry.go:31] will retry after 638.583171ms: exit status 1
functional_test_mount_test.go:331: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "findmnt -T" /mount1
functional_test_mount_test.go:331: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "findmnt -T" /mount2
functional_test_mount_test.go:331: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh "findmnt -T" /mount3
functional_test_mount_test.go:376: (dbg) Run:  out/minikube-linux-arm64 mount -p functional-006924 --kill=true
functional_test_mount_test.go:319: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2343003325/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:319: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2343003325/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:319: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-006924 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-rc.1parallelMoun2343003325/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MountCmd/VerifyCleanup (2.13s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short (0.05s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short
functional_test.go:2261: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 version --short
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/short (0.05s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components (0.54s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components
functional_test.go:2275: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 version -o=json --components
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/Version/components (0.54s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort (0.25s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image ls --format short --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-006924 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.10.1
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.35.0-rc.1
registry.k8s.io/kube-proxy:v1.35.0-rc.1
registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
registry.k8s.io/kube-apiserver:v1.35.0-rc.1
registry.k8s.io/etcd:3.6.6-0
registry.k8s.io/coredns/coredns:v1.13.1
gcr.io/k8s-minikube/storage-provisioner:v5
docker.io/library/minikube-local-cache-test:functional-006924
docker.io/kindest/kindnetd:v20250512-df8de77b
docker.io/kicbase/echo-server:functional-006924
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-006924 image ls --format short --alsologtostderr:
I1219 06:26:42.217364 2084240 out.go:360] Setting OutFile to fd 1 ...
I1219 06:26:42.217538 2084240 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:42.217553 2084240 out.go:374] Setting ErrFile to fd 2...
I1219 06:26:42.217560 2084240 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:42.217875 2084240 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 06:26:42.218759 2084240 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1219 06:26:42.218911 2084240 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1219 06:26:42.219498 2084240 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
I1219 06:26:42.239285 2084240 ssh_runner.go:195] Run: systemctl --version
I1219 06:26:42.239353 2084240 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
I1219 06:26:42.259137 2084240 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
I1219 06:26:42.368224 2084240 ssh_runner.go:195] Run: sudo crictl --timeout=10s images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListShort (0.25s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable (0.23s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image ls --format table --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-006924 image ls --format table --alsologtostderr:
┌─────────────────────────────────────────────┬────────────────────┬───────────────┬────────┐
│                    IMAGE                    │        TAG         │   IMAGE ID    │  SIZE  │
├─────────────────────────────────────────────┼────────────────────┼───────────────┼────────┤
│ registry.k8s.io/kube-scheduler              │ v1.35.0-rc.1       │ sha256:abca4d │ 15.4MB │
│ registry.k8s.io/pause                       │ 3.3                │ sha256:3d1873 │ 249kB  │
│ docker.io/library/minikube-local-cache-test │ functional-006924  │ sha256:51d149 │ 992B   │
│ localhost/my-image                          │ functional-006924  │ sha256:6505c5 │ 831kB  │
│ registry.k8s.io/coredns/coredns             │ v1.13.1            │ sha256:e08f4d │ 21.2MB │
│ registry.k8s.io/pause                       │ 3.1                │ sha256:8057e0 │ 262kB  │
│ registry.k8s.io/pause                       │ latest             │ sha256:8cb209 │ 71.3kB │
│ docker.io/kicbase/echo-server               │ functional-006924  │ sha256:ce2d2c │ 2.17MB │
│ registry.k8s.io/kube-controller-manager     │ v1.35.0-rc.1       │ sha256:a34b34 │ 20.7MB │
│ registry.k8s.io/pause                       │ 3.10.1             │ sha256:d7b100 │ 268kB  │
│ docker.io/kindest/kindnetd                  │ v20250512-df8de77b │ sha256:b1a8c6 │ 40.6MB │
│ registry.k8s.io/etcd                        │ 3.6.6-0            │ sha256:271e49 │ 21.7MB │
│ registry.k8s.io/kube-apiserver              │ v1.35.0-rc.1       │ sha256:3c6ba2 │ 24.7MB │
│ registry.k8s.io/kube-proxy                  │ v1.35.0-rc.1       │ sha256:7e3ace │ 22.4MB │
│ gcr.io/k8s-minikube/storage-provisioner     │ v5                 │ sha256:ba04bb │ 8.03MB │
└─────────────────────────────────────────────┴────────────────────┴───────────────┴────────┘
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-006924 image ls --format table --alsologtostderr:
I1219 06:26:46.383794 2084642 out.go:360] Setting OutFile to fd 1 ...
I1219 06:26:46.383955 2084642 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:46.383983 2084642 out.go:374] Setting ErrFile to fd 2...
I1219 06:26:46.384000 2084642 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:46.384269 2084642 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 06:26:46.384957 2084642 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1219 06:26:46.385113 2084642 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1219 06:26:46.385695 2084642 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
I1219 06:26:46.402714 2084642 ssh_runner.go:195] Run: systemctl --version
I1219 06:26:46.402767 2084642 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
I1219 06:26:46.419625 2084642 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
I1219 06:26:46.527604 2084642 ssh_runner.go:195] Run: sudo crictl --timeout=10s images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListTable (0.23s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson (0.25s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image ls --format json --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-006924 image ls --format json --alsologtostderr:
[{"id":"sha256:6505c5ca2c8ea31d3c7a79e95c6328d648b53a66c622dff2fd9f8335dbe3084d","repoDigests":[],"repoTags":["localhost/my-image:functional-006924"],"size":"830601"},{"id":"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf","repoDigests":["registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6"],"repoTags":["registry.k8s.io/coredns/coredns:v1.13.1"],"size":"21168808"},{"id":"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"262191"},{"id":"sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c","repoDigests":["docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"],"repoTags":["docker.io/kindest/kindnetd:v20250512-df8de77b"],"size":"40636774"},{"id":"sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54","repoDigests":["registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee"],"repoTags":["registry.k8s.io/kube-apiserver:v1.35.0-rc.1"],"size":"24692223"},{"id":"sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a","repoDigests":["registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98"],"repoTags":["registry.k8s.io/kube-controller-manager:v1.35.0-rc.1"],"size":"20672157"},{"id":"sha256:51d14939a1995a88415fccb269ec40dc043aefbcf5035f79ba02097bb3909863","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-006924"],"size":"992"},{"id":"sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde","repoDigests":["registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3"],"repoTags":["registry.k8s.io/kube-scheduler:v1.35.0-rc.1"],"size":"15405535"},{"id":"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd","repoDigests":["registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"],"repoTags":["registry.k8s.io/pause:3.10.1"],"size":"267939"},{"id":"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57","repoDigests":["registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890"],"repoTags":["registry.k8s.io/etcd:3.6.6-0"],"size":"21749640"},{"id":"sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e","repoDigests":["registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5"],"repoTags":["registry.k8s.io/kube-proxy:v1.35.0-rc.1"],"size":"22432301"},{"id":"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"71300"},{"id":"sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"249461"},{"id":"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17","repoDigests":[],"repoTags":["docker.io/kicbase/echo-server:functional-006924"],"size":"2173567"},{"id":"sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6","repoDigests":["gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"8034419"}]
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-006924 image ls --format json --alsologtostderr:
I1219 06:26:46.136836 2084599 out.go:360] Setting OutFile to fd 1 ...
I1219 06:26:46.136949 2084599 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:46.136960 2084599 out.go:374] Setting ErrFile to fd 2...
I1219 06:26:46.136965 2084599 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:46.137459 2084599 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 06:26:46.138133 2084599 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1219 06:26:46.138255 2084599 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1219 06:26:46.138773 2084599 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
I1219 06:26:46.165437 2084599 ssh_runner.go:195] Run: systemctl --version
I1219 06:26:46.165494 2084599 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
I1219 06:26:46.192808 2084599 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
I1219 06:26:46.303658 2084599 ssh_runner.go:195] Run: sudo crictl --timeout=10s images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListJson (0.25s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml (0.25s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image ls --format yaml --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-006924 image ls --format yaml --alsologtostderr:
- id: sha256:a34b3483f25ba81aa72f3aeb607a8c756479e8497d8420acbcd2854162ebf84a
repoDigests:
- registry.k8s.io/kube-controller-manager@sha256:57ab0f75f58d99f4be7bff7bdda015fcbf1b7c20e58ba2722c8c39f751dc8c98
repoTags:
- registry.k8s.io/kube-controller-manager:v1.35.0-rc.1
size: "20672157"
- id: sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "262191"
- id: sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd
repoDigests:
- registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c
repoTags:
- registry.k8s.io/pause:3.10.1
size: "267939"
- id: sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "249461"
- id: sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "71300"
- id: sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17
repoDigests: []
repoTags:
- docker.io/kicbase/echo-server:functional-006924
size: "2173567"
- id: sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c
repoDigests:
- docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a
repoTags:
- docker.io/kindest/kindnetd:v20250512-df8de77b
size: "40636774"
- id: sha256:51d14939a1995a88415fccb269ec40dc043aefbcf5035f79ba02097bb3909863
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-006924
size: "992"
- id: sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6
repoDigests:
- gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "8034419"
- id: sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf
repoDigests:
- registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6
repoTags:
- registry.k8s.io/coredns/coredns:v1.13.1
size: "21168808"
- id: sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57
repoDigests:
- registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890
repoTags:
- registry.k8s.io/etcd:3.6.6-0
size: "21749640"
- id: sha256:7e3acea3d87aa7ca234514e7f9c10450c7a7f87fc273fc9b5a220e2a2be1ce4e
repoDigests:
- registry.k8s.io/kube-proxy@sha256:bdd1fa8b53558a2e1967379a36b085c93faf15581e5fa9f212baf679d89c5bb5
repoTags:
- registry.k8s.io/kube-proxy:v1.35.0-rc.1
size: "22432301"
- id: sha256:abca4d5226620be2218c3971464a1066651a743008c1db8720353446a4b7bbde
repoDigests:
- registry.k8s.io/kube-scheduler@sha256:8155e3db27c7081abfc8eb5da70820cfeaf0bba7449e45360e8220e670f417d3
repoTags:
- registry.k8s.io/kube-scheduler:v1.35.0-rc.1
size: "15405535"
- id: sha256:3c6ba27e07aef16adb050828695bfe6206439147b9ade2a2a1777c276bf79a54
repoDigests:
- registry.k8s.io/kube-apiserver@sha256:58367b5c0428495c0c12411fa7a018f5d40fe57307b85d8935b1ed35706ff7ee
repoTags:
- registry.k8s.io/kube-apiserver:v1.35.0-rc.1
size: "24692223"

functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-006924 image ls --format yaml --alsologtostderr:
I1219 06:26:42.467315 2084284 out.go:360] Setting OutFile to fd 1 ...
I1219 06:26:42.467679 2084284 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:42.467688 2084284 out.go:374] Setting ErrFile to fd 2...
I1219 06:26:42.467694 2084284 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:42.467937 2084284 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 06:26:42.468543 2084284 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1219 06:26:42.468658 2084284 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1219 06:26:42.469199 2084284 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
I1219 06:26:42.489451 2084284 ssh_runner.go:195] Run: systemctl --version
I1219 06:26:42.489524 2084284 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
I1219 06:26:42.509109 2084284 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
I1219 06:26:42.615250 2084284 ssh_runner.go:195] Run: sudo crictl --timeout=10s images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageListYaml (0.25s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild (3.44s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild
functional_test.go:323: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 ssh pgrep buildkitd
functional_test.go:323: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-006924 ssh pgrep buildkitd: exit status 1 (266.052763ms)

** stderr **
	ssh: Process exited with status 1

** /stderr **
functional_test.go:330: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image build -t localhost/my-image:functional-006924 testdata/build --alsologtostderr
functional_test.go:330: (dbg) Done: out/minikube-linux-arm64 -p functional-006924 image build -t localhost/my-image:functional-006924 testdata/build --alsologtostderr: (2.942048916s)
functional_test.go:338: (dbg) Stderr: out/minikube-linux-arm64 -p functional-006924 image build -t localhost/my-image:functional-006924 testdata/build --alsologtostderr:
I1219 06:26:42.966022 2084384 out.go:360] Setting OutFile to fd 1 ...
I1219 06:26:42.966307 2084384 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:42.966340 2084384 out.go:374] Setting ErrFile to fd 2...
I1219 06:26:42.966363 2084384 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1219 06:26:42.967210 2084384 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
I1219 06:26:42.968083 2084384 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1219 06:26:42.969060 2084384 config.go:182] Loaded profile config "functional-006924": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-rc.1
I1219 06:26:42.969908 2084384 cli_runner.go:164] Run: docker container inspect functional-006924 --format={{.State.Status}}
I1219 06:26:42.987904 2084384 ssh_runner.go:195] Run: systemctl --version
I1219 06:26:42.987967 2084384 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-006924
I1219 06:26:43.010154 2084384 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34704 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/functional-006924/id_rsa Username:docker}
I1219 06:26:43.115420 2084384 build_images.go:162] Building image from path: /tmp/build.3263386588.tar
I1219 06:26:43.115495 2084384 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I1219 06:26:43.123385 2084384 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.3263386588.tar
I1219 06:26:43.127155 2084384 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.3263386588.tar: stat -c "%s %y" /var/lib/minikube/build/build.3263386588.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.3263386588.tar': No such file or directory
I1219 06:26:43.127188 2084384 ssh_runner.go:362] scp /tmp/build.3263386588.tar --> /var/lib/minikube/build/build.3263386588.tar (3072 bytes)
I1219 06:26:43.145400 2084384 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.3263386588
I1219 06:26:43.153585 2084384 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.3263386588 -xf /var/lib/minikube/build/build.3263386588.tar
I1219 06:26:43.163076 2084384 containerd.go:394] Building image: /var/lib/minikube/build/build.3263386588
I1219 06:26:43.163158 2084384 ssh_runner.go:195] Run: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.3263386588 --local dockerfile=/var/lib/minikube/build/build.3263386588 --output type=image,name=localhost/my-image:functional-006924
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.0s

#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 1.4s

#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

#4 [internal] load build context
#4 transferring context: 62B done
#4 DONE 0.0s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 0.0s done
#5 DONE 0.1s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0B / 828.50kB 0.2s
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 828.50kB / 828.50kB 0.4s done
#5 extracting sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0.1s done
#5 DONE 0.6s

#6 [2/3] RUN true
#6 DONE 0.2s

#7 [3/3] ADD content.txt /
#7 DONE 0.0s

#8 exporting to image
#8 exporting layers 0.1s done
#8 exporting manifest sha256:c73857c1030e2bc48e22bab2e9bee9e21af990fbb0c2562e3459f825117d427e
#8 exporting manifest sha256:c73857c1030e2bc48e22bab2e9bee9e21af990fbb0c2562e3459f825117d427e 0.0s done
#8 exporting config sha256:6505c5ca2c8ea31d3c7a79e95c6328d648b53a66c622dff2fd9f8335dbe3084d 0.0s done
#8 naming to localhost/my-image:functional-006924 done
#8 DONE 0.2s
I1219 06:26:45.830986 2084384 ssh_runner.go:235] Completed: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.3263386588 --local dockerfile=/var/lib/minikube/build/build.3263386588 --output type=image,name=localhost/my-image:functional-006924: (2.667796576s)
I1219 06:26:45.831071 2084384 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.3263386588
I1219 06:26:45.839987 2084384 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.3263386588.tar
I1219 06:26:45.848079 2084384 build_images.go:218] Built localhost/my-image:functional-006924 from /tmp/build.3263386588.tar
I1219 06:26:45.848113 2084384 build_images.go:134] succeeded building to: functional-006924
I1219 06:26:45.848118 2084384 build_images.go:135] failed building to: 
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageBuild (3.44s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/Setup (0.24s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/Setup
functional_test.go:357: (dbg) Run:  docker pull kicbase/echo-server:1.0
functional_test.go:362: (dbg) Run:  docker tag kicbase/echo-server:1.0 kicbase/echo-server:functional-006924
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/Setup (0.24s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadDaemon (1.14s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:370: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image load --daemon kicbase/echo-server:functional-006924 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadDaemon (1.14s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageReloadDaemon (1.1s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:380: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image load --daemon kicbase/echo-server:functional-006924 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageReloadDaemon (1.10s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageTagAndLoadDaemon (1.34s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:250: (dbg) Run:  docker pull kicbase/echo-server:latest
functional_test.go:255: (dbg) Run:  docker tag kicbase/echo-server:latest kicbase/echo-server:functional-006924
functional_test.go:260: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image load --daemon kicbase/echo-server:functional-006924 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageTagAndLoadDaemon (1.34s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveToFile (0.34s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveToFile
functional_test.go:395: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image save kicbase/echo-server:functional-006924 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveToFile (0.34s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageRemove (0.47s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageRemove
functional_test.go:407: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image rm kicbase/echo-server:functional-006924 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageRemove (0.47s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadFromFile (0.67s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:424: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageLoadFromFile (0.67s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveDaemon (0.4s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:434: (dbg) Run:  docker rmi kicbase/echo-server:functional-006924
functional_test.go:439: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 image save --daemon kicbase/echo-server:functional-006924 --alsologtostderr
functional_test.go:447: (dbg) Run:  docker image inspect kicbase/echo-server:functional-006924
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/ImageCommands/ImageSaveDaemon (0.40s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes (0.15s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_changes (0.15s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster (0.19s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_minikube_cluster (0.19s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters (0.15s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-006924 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/UpdateContextCmd/no_clusters (0.15s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_echo-server_images (0.04s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_echo-server_images
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:1.0
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:functional-006924
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_echo-server_images (0.04s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_my-image_image (0.02s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_my-image_image
functional_test.go:213: (dbg) Run:  docker rmi -f localhost/my-image:functional-006924
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_my-image_image (0.02s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_minikube_cached_images (0.02s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_minikube_cached_images
functional_test.go:221: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-006924
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/delete_minikube_cached_images (0.02s)

TestMultiControlPlane/serial/StartCluster (137.71s)

=== RUN   TestMultiControlPlane/serial/StartCluster
ha_test.go:101: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 start --ha --memory 3072 --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd
E1219 06:29:34.250174 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:29:34.255394 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:29:34.265796 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:29:34.286154 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:29:34.326693 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:29:34.406986 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:29:34.567344 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:29:34.887544 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:29:35.527849 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:29:36.808072 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:29:39.368241 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:29:44.489330 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:29:54.729625 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:30:15.210516 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:30:27.486431 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:101: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 start --ha --memory 3072 --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd: (2m16.794998803s)
ha_test.go:107: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 status --alsologtostderr -v 5
E1219 06:30:56.170709 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
--- PASS: TestMultiControlPlane/serial/StartCluster (137.71s)

TestMultiControlPlane/serial/DeployApp (7.95s)

=== RUN   TestMultiControlPlane/serial/DeployApp
ha_test.go:128: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- apply -f ./testdata/ha/ha-pod-dns-test.yaml
ha_test.go:133: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- rollout status deployment/busybox
ha_test.go:133: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 kubectl -- rollout status deployment/busybox: (4.933126342s)
ha_test.go:140: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:163: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-54q7r -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-c6q8k -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-czll8 -- nslookup kubernetes.io
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-54q7r -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-c6q8k -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-czll8 -- nslookup kubernetes.default
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-54q7r -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-c6q8k -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-czll8 -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiControlPlane/serial/DeployApp (7.95s)

TestMultiControlPlane/serial/PingHostFromPods (1.68s)

=== RUN   TestMultiControlPlane/serial/PingHostFromPods
ha_test.go:199: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-54q7r -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-54q7r -- sh -c "ping -c 1 192.168.49.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-c6q8k -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-c6q8k -- sh -c "ping -c 1 192.168.49.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-czll8 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 kubectl -- exec busybox-7b57f96db7-czll8 -- sh -c "ping -c 1 192.168.49.1"
--- PASS: TestMultiControlPlane/serial/PingHostFromPods (1.68s)

TestMultiControlPlane/serial/AddWorkerNode (31.02s)

=== RUN   TestMultiControlPlane/serial/AddWorkerNode
ha_test.go:228: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 node add --alsologtostderr -v 5
E1219 06:31:29.406659 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:228: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 node add --alsologtostderr -v 5: (29.913700769s)
ha_test.go:234: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 status --alsologtostderr -v 5
ha_test.go:234: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 status --alsologtostderr -v 5: (1.106348028s)
--- PASS: TestMultiControlPlane/serial/AddWorkerNode (31.02s)

TestMultiControlPlane/serial/NodeLabels (0.12s)

=== RUN   TestMultiControlPlane/serial/NodeLabels
ha_test.go:255: (dbg) Run:  kubectl --context ha-079309 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiControlPlane/serial/NodeLabels (0.12s)

TestMultiControlPlane/serial/HAppyAfterClusterStart (1.14s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterClusterStart
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.141617315s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterClusterStart (1.14s)

TestMultiControlPlane/serial/CopyFile (20.67s)

=== RUN   TestMultiControlPlane/serial/CopyFile
ha_test.go:328: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 status --output json --alsologtostderr -v 5
ha_test.go:328: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 status --output json --alsologtostderr -v 5: (1.072131397s)
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp testdata/cp-test.txt ha-079309:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile889509216/001/cp-test_ha-079309.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309:/home/docker/cp-test.txt ha-079309-m02:/home/docker/cp-test_ha-079309_ha-079309-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m02 "sudo cat /home/docker/cp-test_ha-079309_ha-079309-m02.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309:/home/docker/cp-test.txt ha-079309-m03:/home/docker/cp-test_ha-079309_ha-079309-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m03 "sudo cat /home/docker/cp-test_ha-079309_ha-079309-m03.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309:/home/docker/cp-test.txt ha-079309-m04:/home/docker/cp-test_ha-079309_ha-079309-m04.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m04 "sudo cat /home/docker/cp-test_ha-079309_ha-079309-m04.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp testdata/cp-test.txt ha-079309-m02:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309-m02:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile889509216/001/cp-test_ha-079309-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309-m02:/home/docker/cp-test.txt ha-079309:/home/docker/cp-test_ha-079309-m02_ha-079309.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309 "sudo cat /home/docker/cp-test_ha-079309-m02_ha-079309.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309-m02:/home/docker/cp-test.txt ha-079309-m03:/home/docker/cp-test_ha-079309-m02_ha-079309-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m03 "sudo cat /home/docker/cp-test_ha-079309-m02_ha-079309-m03.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309-m02:/home/docker/cp-test.txt ha-079309-m04:/home/docker/cp-test_ha-079309-m02_ha-079309-m04.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m04 "sudo cat /home/docker/cp-test_ha-079309-m02_ha-079309-m04.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp testdata/cp-test.txt ha-079309-m03:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309-m03:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile889509216/001/cp-test_ha-079309-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309-m03:/home/docker/cp-test.txt ha-079309:/home/docker/cp-test_ha-079309-m03_ha-079309.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309 "sudo cat /home/docker/cp-test_ha-079309-m03_ha-079309.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309-m03:/home/docker/cp-test.txt ha-079309-m02:/home/docker/cp-test_ha-079309-m03_ha-079309-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m02 "sudo cat /home/docker/cp-test_ha-079309-m03_ha-079309-m02.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309-m03:/home/docker/cp-test.txt ha-079309-m04:/home/docker/cp-test_ha-079309-m03_ha-079309-m04.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m04 "sudo cat /home/docker/cp-test_ha-079309-m03_ha-079309-m04.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp testdata/cp-test.txt ha-079309-m04:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309-m04:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile889509216/001/cp-test_ha-079309-m04.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309-m04:/home/docker/cp-test.txt ha-079309:/home/docker/cp-test_ha-079309-m04_ha-079309.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309 "sudo cat /home/docker/cp-test_ha-079309-m04_ha-079309.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309-m04:/home/docker/cp-test.txt ha-079309-m02:/home/docker/cp-test_ha-079309-m04_ha-079309-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m02 "sudo cat /home/docker/cp-test_ha-079309-m04_ha-079309-m02.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 cp ha-079309-m04:/home/docker/cp-test.txt ha-079309-m03:/home/docker/cp-test_ha-079309-m04_ha-079309-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 ssh -n ha-079309-m03 "sudo cat /home/docker/cp-test_ha-079309-m04_ha-079309-m03.txt"
--- PASS: TestMultiControlPlane/serial/CopyFile (20.67s)

TestMultiControlPlane/serial/StopSecondaryNode (13.25s)

=== RUN   TestMultiControlPlane/serial/StopSecondaryNode
ha_test.go:365: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 node stop m02 --alsologtostderr -v 5
ha_test.go:365: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 node stop m02 --alsologtostderr -v 5: (12.405865155s)
ha_test.go:371: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 status --alsologtostderr -v 5
ha_test.go:371: (dbg) Non-zero exit: out/minikube-linux-arm64 -p ha-079309 status --alsologtostderr -v 5: exit status 7 (842.087475ms)

-- stdout --
	ha-079309
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-079309-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-079309-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-079309-m04
	type: Worker
	host: Running
	kubelet: Running
	

-- /stdout --
** stderr ** 
	I1219 06:32:11.345022 2102143 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:32:11.345154 2102143 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:32:11.345167 2102143 out.go:374] Setting ErrFile to fd 2...
	I1219 06:32:11.345172 2102143 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:32:11.347301 2102143 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:32:11.347534 2102143 out.go:368] Setting JSON to false
	I1219 06:32:11.347567 2102143 mustload.go:66] Loading cluster: ha-079309
	I1219 06:32:11.347697 2102143 notify.go:221] Checking for updates...
	I1219 06:32:11.347984 2102143 config.go:182] Loaded profile config "ha-079309": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1219 06:32:11.348009 2102143 status.go:174] checking status of ha-079309 ...
	I1219 06:32:11.348549 2102143 cli_runner.go:164] Run: docker container inspect ha-079309 --format={{.State.Status}}
	I1219 06:32:11.372469 2102143 status.go:371] ha-079309 host status = "Running" (err=<nil>)
	I1219 06:32:11.372497 2102143 host.go:66] Checking if "ha-079309" exists ...
	I1219 06:32:11.372912 2102143 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-079309
	I1219 06:32:11.396891 2102143 host.go:66] Checking if "ha-079309" exists ...
	I1219 06:32:11.397206 2102143 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:32:11.397257 2102143 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-079309
	I1219 06:32:11.415759 2102143 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34709 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/ha-079309/id_rsa Username:docker}
	I1219 06:32:11.538852 2102143 ssh_runner.go:195] Run: systemctl --version
	I1219 06:32:11.545483 2102143 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:32:11.560807 2102143 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:32:11.620674 2102143 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:4 ContainersRunning:3 ContainersPaused:0 ContainersStopped:1 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:62 OomKillDisable:true NGoroutines:72 SystemTime:2025-12-19 06:32:11.610217066 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:32:11.621309 2102143 kubeconfig.go:125] found "ha-079309" server: "https://192.168.49.254:8443"
	I1219 06:32:11.621349 2102143 api_server.go:166] Checking apiserver status ...
	I1219 06:32:11.621401 2102143 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:32:11.636553 2102143 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1433/cgroup
	I1219 06:32:11.646942 2102143 api_server.go:182] apiserver freezer: "9:freezer:/docker/1c8cc95918d38150b76cb681ac6c607c8c271df29f54ffd9cedf0b2e05fbb53f/kubepods/burstable/pod1c63e0b0f98bd935573d034e161c0286/d0ac4a4c085cd0195b11993bd39613914c5ca2dce1debba8968d8c1ca9458de5"
	I1219 06:32:11.647103 2102143 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/1c8cc95918d38150b76cb681ac6c607c8c271df29f54ffd9cedf0b2e05fbb53f/kubepods/burstable/pod1c63e0b0f98bd935573d034e161c0286/d0ac4a4c085cd0195b11993bd39613914c5ca2dce1debba8968d8c1ca9458de5/freezer.state
	I1219 06:32:11.663908 2102143 api_server.go:204] freezer state: "THAWED"
	I1219 06:32:11.663943 2102143 api_server.go:253] Checking apiserver healthz at https://192.168.49.254:8443/healthz ...
	I1219 06:32:11.674167 2102143 api_server.go:279] https://192.168.49.254:8443/healthz returned 200:
	ok
	I1219 06:32:11.674197 2102143 status.go:463] ha-079309 apiserver status = Running (err=<nil>)
	I1219 06:32:11.674208 2102143 status.go:176] ha-079309 status: &{Name:ha-079309 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1219 06:32:11.674255 2102143 status.go:174] checking status of ha-079309-m02 ...
	I1219 06:32:11.674565 2102143 cli_runner.go:164] Run: docker container inspect ha-079309-m02 --format={{.State.Status}}
	I1219 06:32:11.695779 2102143 status.go:371] ha-079309-m02 host status = "Stopped" (err=<nil>)
	I1219 06:32:11.695802 2102143 status.go:384] host is not running, skipping remaining checks
	I1219 06:32:11.695809 2102143 status.go:176] ha-079309-m02 status: &{Name:ha-079309-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1219 06:32:11.695829 2102143 status.go:174] checking status of ha-079309-m03 ...
	I1219 06:32:11.696152 2102143 cli_runner.go:164] Run: docker container inspect ha-079309-m03 --format={{.State.Status}}
	I1219 06:32:11.714383 2102143 status.go:371] ha-079309-m03 host status = "Running" (err=<nil>)
	I1219 06:32:11.714410 2102143 host.go:66] Checking if "ha-079309-m03" exists ...
	I1219 06:32:11.714723 2102143 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-079309-m03
	I1219 06:32:11.735426 2102143 host.go:66] Checking if "ha-079309-m03" exists ...
	I1219 06:32:11.735757 2102143 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:32:11.735804 2102143 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-079309-m03
	I1219 06:32:11.754630 2102143 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34719 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/ha-079309-m03/id_rsa Username:docker}
	I1219 06:32:11.864385 2102143 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:32:11.883499 2102143 kubeconfig.go:125] found "ha-079309" server: "https://192.168.49.254:8443"
	I1219 06:32:11.883533 2102143 api_server.go:166] Checking apiserver status ...
	I1219 06:32:11.883617 2102143 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:32:11.901528 2102143 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1416/cgroup
	I1219 06:32:11.910718 2102143 api_server.go:182] apiserver freezer: "9:freezer:/docker/b8207b014feb9ce1e7b2b43817490b3aa03429aca413a3c670b3e98c3794f651/kubepods/burstable/pod9665388faa998f96cedea1434df0e4f9/97d53979e83cf9967c271064ca5c938f624dfc433e7f792ed08b5da536135673"
	I1219 06:32:11.910803 2102143 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/b8207b014feb9ce1e7b2b43817490b3aa03429aca413a3c670b3e98c3794f651/kubepods/burstable/pod9665388faa998f96cedea1434df0e4f9/97d53979e83cf9967c271064ca5c938f624dfc433e7f792ed08b5da536135673/freezer.state
	I1219 06:32:11.919146 2102143 api_server.go:204] freezer state: "THAWED"
	I1219 06:32:11.919177 2102143 api_server.go:253] Checking apiserver healthz at https://192.168.49.254:8443/healthz ...
	I1219 06:32:11.927616 2102143 api_server.go:279] https://192.168.49.254:8443/healthz returned 200:
	ok
	I1219 06:32:11.927652 2102143 status.go:463] ha-079309-m03 apiserver status = Running (err=<nil>)
	I1219 06:32:11.927664 2102143 status.go:176] ha-079309-m03 status: &{Name:ha-079309-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1219 06:32:11.927682 2102143 status.go:174] checking status of ha-079309-m04 ...
	I1219 06:32:11.928010 2102143 cli_runner.go:164] Run: docker container inspect ha-079309-m04 --format={{.State.Status}}
	I1219 06:32:11.946367 2102143 status.go:371] ha-079309-m04 host status = "Running" (err=<nil>)
	I1219 06:32:11.946399 2102143 host.go:66] Checking if "ha-079309-m04" exists ...
	I1219 06:32:11.946704 2102143 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-079309-m04
	I1219 06:32:11.965613 2102143 host.go:66] Checking if "ha-079309-m04" exists ...
	I1219 06:32:11.965919 2102143 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:32:11.965968 2102143 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-079309-m04
	I1219 06:32:11.984117 2102143 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34724 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/ha-079309-m04/id_rsa Username:docker}
	I1219 06:32:12.098804 2102143 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:32:12.115247 2102143 status.go:176] ha-079309-m04 status: &{Name:ha-079309-m04 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiControlPlane/serial/StopSecondaryNode (13.25s)
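Note: the status checks in the stderr trace above lean on two small pipelines: `df -h /var | awk 'NR==2{print $5}'` for disk usage, and a grep for the `freezer:` line of `/proc/<pid>/cgroup` before reading `freezer.state`. Both can be exercised offline; the `df` output and cgroup content below are fabricated stand-ins, not taken from the node.

```shell
# Fabricated df output: print the Use% column of the data line, as the
# /var disk-usage check does.
printf 'Filesystem      Size  Used Avail Use%% Mounted on\n/dev/vda1        98G   12G   81G  13%% /var\n' \
  | awk 'NR==2{print $5}'
# prints: 13%

# Fabricated /proc/<pid>/cgroup content: isolate the freezer hierarchy line,
# as the apiserver check does before reading freezer.state.
printf '10:memory:/docker/abc123\n9:freezer:/docker/abc123/kubepods\n' \
  | grep -E '^[0-9]+:freezer:'
# prints: 9:freezer:/docker/abc123/kubepods
```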

=== RUN   TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.84s)

=== RUN   TestMultiControlPlane/serial/RestartSecondaryNode
ha_test.go:422: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 node start m02 --alsologtostderr -v 5
E1219 06:32:18.091433 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:422: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 node start m02 --alsologtostderr -v 5: (13.105147065s)
ha_test.go:430: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 status --alsologtostderr -v 5
ha_test.go:430: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 status --alsologtostderr -v 5: (1.163659386s)
ha_test.go:450: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiControlPlane/serial/RestartSecondaryNode (14.37s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.094382258s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (1.09s)

=== RUN   TestMultiControlPlane/serial/RestartClusterKeepsNodes
ha_test.go:458: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 node list --alsologtostderr -v 5
ha_test.go:464: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 stop --alsologtostderr -v 5
ha_test.go:464: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 stop --alsologtostderr -v 5: (37.510377651s)
ha_test.go:469: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 start --wait true --alsologtostderr -v 5
ha_test.go:469: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 start --wait true --alsologtostderr -v 5: (1m1.198402224s)
ha_test.go:474: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 node list --alsologtostderr -v 5
--- PASS: TestMultiControlPlane/serial/RestartClusterKeepsNodes (98.87s)

=== RUN   TestMultiControlPlane/serial/DeleteSecondaryNode
ha_test.go:489: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 node delete m03 --alsologtostderr -v 5
ha_test.go:489: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 node delete m03 --alsologtostderr -v 5: (10.18862579s)
ha_test.go:495: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 status --alsologtostderr -v 5
ha_test.go:513: (dbg) Run:  kubectl get nodes
ha_test.go:521: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiControlPlane/serial/DeleteSecondaryNode (11.25s)
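Note: the go-template passed to `kubectl get nodes` above emits one `Ready` condition status per node, one per line. Given that output, "every node Ready" reduces to a line count; the two-node sample below is fabricated (and whitespace-trimmed), and the real test does the comparison in Go rather than in the shell.

```shell
# Fabricated template output: one Ready-condition status token per node.
# Count lines that are not exactly "True"; zero means all nodes are Ready.
not_ready=$(printf 'True\nTrue\n' | grep -cvx 'True' || true)
echo "$not_ready"
# prints: 0
```

The `|| true` guards against `grep -c` exiting non-zero when the count is 0.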

=== RUN   TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.81s)

=== RUN   TestMultiControlPlane/serial/StopCluster
ha_test.go:533: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 stop --alsologtostderr -v 5
E1219 06:34:32.454599 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:34:34.247575 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:533: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 stop --alsologtostderr -v 5: (36.280338801s)
ha_test.go:539: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 status --alsologtostderr -v 5
ha_test.go:539: (dbg) Non-zero exit: out/minikube-linux-arm64 -p ha-079309 status --alsologtostderr -v 5: exit status 7 (117.837695ms)

-- stdout --
	ha-079309
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-079309-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-079309-m04
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I1219 06:34:55.685047 2116862 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:34:55.685238 2116862 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:34:55.685265 2116862 out.go:374] Setting ErrFile to fd 2...
	I1219 06:34:55.685286 2116862 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:34:55.685567 2116862 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:34:55.685790 2116862 out.go:368] Setting JSON to false
	I1219 06:34:55.685847 2116862 mustload.go:66] Loading cluster: ha-079309
	I1219 06:34:55.685937 2116862 notify.go:221] Checking for updates...
	I1219 06:34:55.686382 2116862 config.go:182] Loaded profile config "ha-079309": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1219 06:34:55.686729 2116862 status.go:174] checking status of ha-079309 ...
	I1219 06:34:55.687305 2116862 cli_runner.go:164] Run: docker container inspect ha-079309 --format={{.State.Status}}
	I1219 06:34:55.711264 2116862 status.go:371] ha-079309 host status = "Stopped" (err=<nil>)
	I1219 06:34:55.711290 2116862 status.go:384] host is not running, skipping remaining checks
	I1219 06:34:55.711297 2116862 status.go:176] ha-079309 status: &{Name:ha-079309 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1219 06:34:55.711327 2116862 status.go:174] checking status of ha-079309-m02 ...
	I1219 06:34:55.711635 2116862 cli_runner.go:164] Run: docker container inspect ha-079309-m02 --format={{.State.Status}}
	I1219 06:34:55.735822 2116862 status.go:371] ha-079309-m02 host status = "Stopped" (err=<nil>)
	I1219 06:34:55.735842 2116862 status.go:384] host is not running, skipping remaining checks
	I1219 06:34:55.735848 2116862 status.go:176] ha-079309-m02 status: &{Name:ha-079309-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1219 06:34:55.735866 2116862 status.go:174] checking status of ha-079309-m04 ...
	I1219 06:34:55.736175 2116862 cli_runner.go:164] Run: docker container inspect ha-079309-m04 --format={{.State.Status}}
	I1219 06:34:55.754112 2116862 status.go:371] ha-079309-m04 host status = "Stopped" (err=<nil>)
	I1219 06:34:55.754138 2116862 status.go:384] host is not running, skipping remaining checks
	I1219 06:34:55.754145 2116862 status.go:176] ha-079309-m04 status: &{Name:ha-079309-m04 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiControlPlane/serial/StopCluster (36.40s)
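Note: the `status` stdout above lists per-node component states, and in this run the all-stopped state came back with exit status 7. Tallying the stopped hosts from such a dump is a one-line grep; the sample here is an abbreviated copy of the output above.

```shell
# Abbreviated status dump (from the output above), one "host: Stopped"
# line per node.
status='ha-079309
host: Stopped
ha-079309-m02
host: Stopped
ha-079309-m04
host: Stopped'
printf '%s\n' "$status" | grep -c '^host: Stopped'
# prints: 3
```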

=== RUN   TestMultiControlPlane/serial/RestartCluster
ha_test.go:562: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 start --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd
E1219 06:35:01.932001 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:35:27.486147 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:562: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 start --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd: (59.382571276s)
ha_test.go:568: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 status --alsologtostderr -v 5
ha_test.go:586: (dbg) Run:  kubectl get nodes
ha_test.go:594: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiControlPlane/serial/RestartCluster (60.38s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterClusterRestart
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterClusterRestart (0.87s)

=== RUN   TestMultiControlPlane/serial/AddSecondaryNode
ha_test.go:607: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 node add --control-plane --alsologtostderr -v 5
E1219 06:36:29.406710 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:607: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 node add --control-plane --alsologtostderr -v 5: (1m0.266351104s)
ha_test.go:613: (dbg) Run:  out/minikube-linux-arm64 -p ha-079309 status --alsologtostderr -v 5
ha_test.go:613: (dbg) Done: out/minikube-linux-arm64 -p ha-079309 status --alsologtostderr -v 5: (1.122965406s)
--- PASS: TestMultiControlPlane/serial/AddSecondaryNode (61.39s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.095408341s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (1.10s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 start -p json-output-270201 --output=json --user=testUser --memory=3072 --wait=true --driver=docker  --container-runtime=containerd
json_output_test.go:63: (dbg) Done: out/minikube-linux-arm64 start -p json-output-270201 --output=json --user=testUser --memory=3072 --wait=true --driver=docker  --container-runtime=containerd: (55.025467838s)
--- PASS: TestJSONOutput/start/Command (55.03s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 pause -p json-output-270201 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.73s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 unpause -p json-output-270201 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.66s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 stop -p json-output-270201 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-linux-arm64 stop -p json-output-270201 --output=json --user=testUser: (5.986290933s)
--- PASS: TestJSONOutput/stop/Command (5.99s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-linux-arm64 start -p json-output-error-970976 --memory=3072 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p json-output-error-970976 --memory=3072 --output=json --wait=true --driver=fail: exit status 56 (93.191896ms)

-- stdout --
	{"specversion":"1.0","id":"9721fb37-b70e-4b84-84b2-41dd38fb8990","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-970976] minikube v1.37.0 on Ubuntu 20.04 (arm64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"fd7dcb5d-2c75-421a-86ed-e317b6714b18","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=22230"}}
	{"specversion":"1.0","id":"4fa4ce2e-82e8-4f83-ad0d-6bfb34b99d7a","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"616a6a3d-e283-4c97-960c-6982638afbc8","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig"}}
	{"specversion":"1.0","id":"aa51569a-13a2-40a8-9970-c53b8abde313","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube"}}
	{"specversion":"1.0","id":"a96144dd-2757-4a98-b1cc-0172abde0da1","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-arm64"}}
	{"specversion":"1.0","id":"0993cdf5-a088-4ceb-a368-a82210059c30","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"d9c41f05-4849-4cb3-830c-168b828d5c49","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on linux/arm64","name":"DRV_UNSUPPORTED_OS","url":""}}

-- /stdout --
helpers_test.go:176: Cleaning up "json-output-error-970976" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p json-output-error-970976
--- PASS: TestErrorJSONOutput (0.24s)
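Note: the `--output=json` events above are CloudEvents-style JSON objects, one per line. For a quick look without a JSON parser, a field such as the error event's `exitcode` can be pulled out with `sed`; the event below is abbreviated from the log above, keeping only a few fields.

```shell
# Abbreviated error event (fields copied from the output above).
event='{"specversion":"1.0","type":"io.k8s.sigs.minikube.error","data":{"exitcode":"56","name":"DRV_UNSUPPORTED_OS"}}'
printf '%s\n' "$event" | sed -n 's/.*"exitcode":"\([0-9]*\)".*/\1/p'
# prints: 56
```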

=== RUN   TestKicCustomNetwork/create_custom_network
kic_custom_network_test.go:57: (dbg) Run:  out/minikube-linux-arm64 start -p docker-network-243549 --network=
kic_custom_network_test.go:57: (dbg) Done: out/minikube-linux-arm64 start -p docker-network-243549 --network=: (34.36859012s)
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
helpers_test.go:176: Cleaning up "docker-network-243549" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p docker-network-243549
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p docker-network-243549: (2.23994199s)
--- PASS: TestKicCustomNetwork/create_custom_network (36.64s)

=== RUN   TestKicCustomNetwork/use_default_bridge_network
kic_custom_network_test.go:57: (dbg) Run:  out/minikube-linux-arm64 start -p docker-network-492658 --network=bridge
kic_custom_network_test.go:57: (dbg) Done: out/minikube-linux-arm64 start -p docker-network-492658 --network=bridge: (35.234877032s)
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
helpers_test.go:176: Cleaning up "docker-network-492658" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p docker-network-492658
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p docker-network-492658: (2.083817615s)
--- PASS: TestKicCustomNetwork/use_default_bridge_network (37.34s)

=== RUN   TestKicExistingNetwork
I1219 06:39:29.490532 2000386 cli_runner.go:164] Run: docker network inspect existing-network --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
W1219 06:39:29.507715 2000386 cli_runner.go:211] docker network inspect existing-network --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
I1219 06:39:29.507790 2000386 network_create.go:284] running [docker network inspect existing-network] to gather additional debugging logs...
I1219 06:39:29.507808 2000386 cli_runner.go:164] Run: docker network inspect existing-network
W1219 06:39:29.524166 2000386 cli_runner.go:211] docker network inspect existing-network returned with exit code 1
I1219 06:39:29.524194 2000386 network_create.go:287] error running [docker network inspect existing-network]: docker network inspect existing-network: exit status 1
stdout:
[]

stderr:
Error response from daemon: network existing-network not found
I1219 06:39:29.524212 2000386 network_create.go:289] output of [docker network inspect existing-network]: -- stdout --
[]

-- /stdout --
** stderr ** 
Error response from daemon: network existing-network not found

** /stderr **
I1219 06:39:29.524332 2000386 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
I1219 06:39:29.543054 2000386 network.go:211] skipping subnet 192.168.49.0/24 that is taken: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName:br-e2ac0574767c IfaceIPv4:192.168.49.1 IfaceMTU:1500 IfaceMAC:e2:05:6c:fe:29:8c} reservation:<nil>}
I1219 06:39:29.543369 2000386 network.go:206] using free private subnet 192.168.58.0/24: &{IP:192.168.58.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.58.0/24 Gateway:192.168.58.1 ClientMin:192.168.58.2 ClientMax:192.168.58.254 Broadcast:192.168.58.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x4000011c20}
I1219 06:39:29.543397 2000386 network_create.go:124] attempt to create docker network existing-network 192.168.58.0/24 with gateway 192.168.58.1 and MTU of 1500 ...
I1219 06:39:29.543448 2000386 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.58.0/24 --gateway=192.168.58.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=existing-network existing-network
I1219 06:39:29.601128 2000386 network_create.go:108] docker network existing-network 192.168.58.0/24 created
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
kic_custom_network_test.go:93: (dbg) Run:  out/minikube-linux-arm64 start -p existing-network-205678 --network=existing-network
E1219 06:39:34.246972 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
kic_custom_network_test.go:93: (dbg) Done: out/minikube-linux-arm64 start -p existing-network-205678 --network=existing-network: (32.048663346s)
helpers_test.go:176: Cleaning up "existing-network-205678" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p existing-network-205678
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p existing-network-205678: (2.170662543s)
I1219 06:40:03.837099 2000386 cli_runner.go:164] Run: docker network ls --filter=label=existing-network --format {{.Name}}
--- PASS: TestKicExistingNetwork (34.36s)
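Aside: the subnet bookkeeping in the network.go lines above (Gateway, ClientMin, ClientMax, and Broadcast for the selected free subnet 192.168.58.0/24) can be reproduced with Python's ipaddress module. This is an illustrative sketch of the arithmetic only, not minikube's actual implementation:

```python
import ipaddress

# The free private subnet minikube selected in the log above.
net = ipaddress.ip_network("192.168.58.0/24")

hosts = list(net.hosts())          # usable addresses .1 through .254
gateway = hosts[0]                 # first usable address serves as the gateway
client_min, client_max = hosts[1], hosts[-1]

print(net.netmask)                                           # 255.255.255.0
print(gateway, client_min, client_max, net.broadcast_address)
# 192.168.58.1 192.168.58.2 192.168.58.254 192.168.58.255
```

These values match the Gateway/ClientMin/ClientMax/Broadcast fields logged by network.go:206.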

TestKicCustomSubnet (34.85s)

=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:112: (dbg) Run:  out/minikube-linux-arm64 start -p custom-subnet-359572 --subnet=192.168.60.0/24
E1219 06:40:10.534763 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:40:27.486036 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
kic_custom_network_test.go:112: (dbg) Done: out/minikube-linux-arm64 start -p custom-subnet-359572 --subnet=192.168.60.0/24: (32.567470265s)
kic_custom_network_test.go:161: (dbg) Run:  docker network inspect custom-subnet-359572 --format "{{(index .IPAM.Config 0).Subnet}}"
helpers_test.go:176: Cleaning up "custom-subnet-359572" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p custom-subnet-359572
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p custom-subnet-359572: (2.256202189s)
--- PASS: TestKicCustomSubnet (34.85s)
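Aside: the `docker network inspect ... --format "{{(index .IPAM.Config 0).Subnet}}"` check above pulls one field out of docker's inspect JSON via a Go template. The equivalent extraction can be sketched in Python against a sample payload (the sample below is illustrative, not captured from this run):

```python
import json

# Sample shaped like `docker network inspect <name>` output (illustrative).
inspect_output = json.loads("""
[{"Name": "custom-subnet-359572",
  "IPAM": {"Config": [{"Subnet": "192.168.60.0/24", "Gateway": "192.168.60.1"}]}}]
""")

# Equivalent of the Go template {{(index .IPAM.Config 0).Subnet}}.
subnet = inspect_output[0]["IPAM"]["Config"][0]["Subnet"]
print(subnet)  # 192.168.60.0/24
```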

TestKicStaticIP (35.58s)

=== RUN   TestKicStaticIP
kic_custom_network_test.go:132: (dbg) Run:  out/minikube-linux-arm64 start -p static-ip-163968 --static-ip=192.168.200.200
kic_custom_network_test.go:132: (dbg) Done: out/minikube-linux-arm64 start -p static-ip-163968 --static-ip=192.168.200.200: (33.198728936s)
kic_custom_network_test.go:138: (dbg) Run:  out/minikube-linux-arm64 -p static-ip-163968 ip
helpers_test.go:176: Cleaning up "static-ip-163968" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p static-ip-163968
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p static-ip-163968: (2.211005823s)
--- PASS: TestKicStaticIP (35.58s)
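Aside: the address requested via `--static-ip` above falls in the 192.168.0.0/16 private range, which the ipaddress module can confirm (a quick illustrative check, not part of the test):

```python
import ipaddress

# The value passed via --static-ip in the test above.
ip = ipaddress.ip_address("192.168.200.200")
print(ip.version, ip.is_private)  # 4 True
```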

TestMainNoArgs (0.06s)

=== RUN   TestMainNoArgs
main_test.go:70: (dbg) Run:  out/minikube-linux-arm64
--- PASS: TestMainNoArgs (0.06s)

TestMinikubeProfile (75.09s)

=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-arm64 start -p first-243301 --driver=docker  --container-runtime=containerd
E1219 06:41:29.406628 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-arm64 start -p first-243301 --driver=docker  --container-runtime=containerd: (32.550188232s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-arm64 start -p second-245787 --driver=docker  --container-runtime=containerd
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-arm64 start -p second-245787 --driver=docker  --container-runtime=containerd: (36.471496317s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-arm64 profile first-243301
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-arm64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-arm64 profile second-245787
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-arm64 profile list -ojson
helpers_test.go:176: Cleaning up "second-245787" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p second-245787
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p second-245787: (2.08132706s)
helpers_test.go:176: Cleaning up "first-243301" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p first-243301
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p first-243301: (2.478548631s)
--- PASS: TestMinikubeProfile (75.09s)

TestMountStart/serial/StartWithMountFirst (8.59s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:118: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-1-783284 --memory=3072 --mount-string /tmp/TestMountStartserial278301195/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd
mount_start_test.go:118: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-1-783284 --memory=3072 --mount-string /tmp/TestMountStartserial278301195/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd: (7.585561878s)
--- PASS: TestMountStart/serial/StartWithMountFirst (8.59s)

TestMountStart/serial/VerifyMountFirst (0.28s)

=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-1-783284 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountFirst (0.28s)

TestMountStart/serial/StartWithMountSecond (6.06s)

=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:118: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-2-785302 --memory=3072 --mount-string /tmp/TestMountStartserial278301195/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd
mount_start_test.go:118: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-2-785302 --memory=3072 --mount-string /tmp/TestMountStartserial278301195/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd: (5.063105174s)
--- PASS: TestMountStart/serial/StartWithMountSecond (6.06s)

TestMountStart/serial/VerifyMountSecond (0.27s)

=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-785302 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountSecond (0.27s)

TestMountStart/serial/DeleteFirst (1.77s)

=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-linux-arm64 delete -p mount-start-1-783284 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-linux-arm64 delete -p mount-start-1-783284 --alsologtostderr -v=5: (1.772460906s)
--- PASS: TestMountStart/serial/DeleteFirst (1.77s)

TestMountStart/serial/VerifyMountPostDelete (0.28s)

=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-785302 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.28s)

TestMountStart/serial/Stop (1.3s)

=== RUN   TestMountStart/serial/Stop
mount_start_test.go:196: (dbg) Run:  out/minikube-linux-arm64 stop -p mount-start-2-785302
mount_start_test.go:196: (dbg) Done: out/minikube-linux-arm64 stop -p mount-start-2-785302: (1.296700758s)
--- PASS: TestMountStart/serial/Stop (1.30s)

TestMountStart/serial/RestartStopped (7.69s)

=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:207: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-2-785302
mount_start_test.go:207: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-2-785302: (6.690552386s)
--- PASS: TestMountStart/serial/RestartStopped (7.69s)

TestMountStart/serial/VerifyMountPostStop (0.28s)

=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-785302 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.28s)

TestMultiNode/serial/FreshStart2Nodes (80.89s)

=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:96: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-613405 --wait=true --memory=3072 --nodes=2 -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd
multinode_test.go:96: (dbg) Done: out/minikube-linux-arm64 start -p multinode-613405 --wait=true --memory=3072 --nodes=2 -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd: (1m20.33453245s)
multinode_test.go:102: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (80.89s)

TestMultiNode/serial/DeployApp2Nodes (7.07s)

=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:493: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:498: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- rollout status deployment/busybox
multinode_test.go:498: (dbg) Done: out/minikube-linux-arm64 kubectl -p multinode-613405 -- rollout status deployment/busybox: (5.210158301s)
multinode_test.go:505: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:528: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:536: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- exec busybox-7b57f96db7-f67br -- nslookup kubernetes.io
multinode_test.go:536: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- exec busybox-7b57f96db7-pjqht -- nslookup kubernetes.io
multinode_test.go:546: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- exec busybox-7b57f96db7-f67br -- nslookup kubernetes.default
multinode_test.go:546: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- exec busybox-7b57f96db7-pjqht -- nslookup kubernetes.default
multinode_test.go:554: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- exec busybox-7b57f96db7-f67br -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:554: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- exec busybox-7b57f96db7-pjqht -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (7.07s)
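Aside: the `-o jsonpath='{.items[*].status.podIP}'` query above flattens every pod's IP into one space-separated string. The same extraction can be sketched in Python against a sample payload (pod names and IPs below are illustrative, not from this run):

```python
import json

# Sample shaped like `kubectl get pods -o json` output (illustrative).
pods = json.loads("""
{"items": [
  {"metadata": {"name": "busybox-7b57f96db7-aaaaa"}, "status": {"podIP": "10.244.0.3"}},
  {"metadata": {"name": "busybox-7b57f96db7-bbbbb"}, "status": {"podIP": "10.244.1.2"}}
]}
""")

# Equivalent of: -o jsonpath='{.items[*].status.podIP}'
pod_ips = " ".join(p["status"]["podIP"] for p in pods["items"])
print(pod_ips)  # 10.244.0.3 10.244.1.2
```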

TestMultiNode/serial/PingHostFrom2Pods (1.04s)

=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:564: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:572: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- exec busybox-7b57f96db7-f67br -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- exec busybox-7b57f96db7-f67br -- sh -c "ping -c 1 192.168.67.1"
multinode_test.go:572: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- exec busybox-7b57f96db7-pjqht -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-613405 -- exec busybox-7b57f96db7-pjqht -- sh -c "ping -c 1 192.168.67.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (1.04s)
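Aside: the pipeline `nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3` above takes line 5 of the nslookup output and its third space-delimited field. A Python sketch of the same extraction, run on a sample of busybox-style nslookup output (illustrative; the exact layout varies by nslookup implementation):

```python
# Sample busybox `nslookup host.minikube.internal` output (illustrative).
sample = """\
Server:    10.96.0.10
Address 1: 10.96.0.10 kube-dns.kube-system.svc.cluster.local

Name:      host.minikube.internal
Address 1: 192.168.67.1 host.minikube.internal
"""

# Equivalent of: awk 'NR==5' | cut -d' ' -f3
line5 = sample.splitlines()[4]
host_ip = line5.split(" ")[2]
print(host_ip)  # 192.168.67.1
```

The extracted address is then the target of the `ping -c 1` in the next step of the test.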

TestMultiNode/serial/AddNode (28.83s)

=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:121: (dbg) Run:  out/minikube-linux-arm64 node add -p multinode-613405 -v=5 --alsologtostderr
E1219 06:44:34.247426 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:121: (dbg) Done: out/minikube-linux-arm64 node add -p multinode-613405 -v=5 --alsologtostderr: (28.111753895s)
multinode_test.go:127: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (28.83s)

TestMultiNode/serial/MultiNodeLabels (0.09s)

=== RUN   TestMultiNode/serial/MultiNodeLabels
multinode_test.go:221: (dbg) Run:  kubectl --context multinode-613405 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiNode/serial/MultiNodeLabels (0.09s)

TestMultiNode/serial/ProfileList (0.73s)

=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:143: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.73s)

TestMultiNode/serial/CopyFile (11.13s)

=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:184: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 status --output json --alsologtostderr
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 cp testdata/cp-test.txt multinode-613405:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 cp multinode-613405:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile2640788821/001/cp-test_multinode-613405.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 cp multinode-613405:/home/docker/cp-test.txt multinode-613405-m02:/home/docker/cp-test_multinode-613405_multinode-613405-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405-m02 "sudo cat /home/docker/cp-test_multinode-613405_multinode-613405-m02.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 cp multinode-613405:/home/docker/cp-test.txt multinode-613405-m03:/home/docker/cp-test_multinode-613405_multinode-613405-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405-m03 "sudo cat /home/docker/cp-test_multinode-613405_multinode-613405-m03.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 cp testdata/cp-test.txt multinode-613405-m02:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 cp multinode-613405-m02:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile2640788821/001/cp-test_multinode-613405-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 cp multinode-613405-m02:/home/docker/cp-test.txt multinode-613405:/home/docker/cp-test_multinode-613405-m02_multinode-613405.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405 "sudo cat /home/docker/cp-test_multinode-613405-m02_multinode-613405.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 cp multinode-613405-m02:/home/docker/cp-test.txt multinode-613405-m03:/home/docker/cp-test_multinode-613405-m02_multinode-613405-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405-m03 "sudo cat /home/docker/cp-test_multinode-613405-m02_multinode-613405-m03.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 cp testdata/cp-test.txt multinode-613405-m03:/home/docker/cp-test.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 cp multinode-613405-m03:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile2640788821/001/cp-test_multinode-613405-m03.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 cp multinode-613405-m03:/home/docker/cp-test.txt multinode-613405:/home/docker/cp-test_multinode-613405-m03_multinode-613405.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405 "sudo cat /home/docker/cp-test_multinode-613405-m03_multinode-613405.txt"
helpers_test.go:574: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 cp multinode-613405-m03:/home/docker/cp-test.txt multinode-613405-m02:/home/docker/cp-test_multinode-613405-m03_multinode-613405-m02.txt
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:552: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 ssh -n multinode-613405-m02 "sudo cat /home/docker/cp-test_multinode-613405-m03_multinode-613405-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (11.13s)

TestMultiNode/serial/StopNode (2.44s)

=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:248: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 node stop m03
multinode_test.go:248: (dbg) Done: out/minikube-linux-arm64 -p multinode-613405 node stop m03: (1.317547952s)
multinode_test.go:254: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 status
multinode_test.go:254: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-613405 status: exit status 7 (575.805552ms)

-- stdout --
	multinode-613405
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-613405-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-613405-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:261: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 status --alsologtostderr
multinode_test.go:261: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-613405 status --alsologtostderr: exit status 7 (547.669432ms)

-- stdout --
	multinode-613405
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-613405-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-613405-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I1219 06:45:09.569591 2170338 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:45:09.569832 2170338 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:45:09.569844 2170338 out.go:374] Setting ErrFile to fd 2...
	I1219 06:45:09.569850 2170338 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:45:09.570123 2170338 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:45:09.570340 2170338 out.go:368] Setting JSON to false
	I1219 06:45:09.570384 2170338 mustload.go:66] Loading cluster: multinode-613405
	I1219 06:45:09.570475 2170338 notify.go:221] Checking for updates...
	I1219 06:45:09.570836 2170338 config.go:182] Loaded profile config "multinode-613405": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1219 06:45:09.570859 2170338 status.go:174] checking status of multinode-613405 ...
	I1219 06:45:09.571721 2170338 cli_runner.go:164] Run: docker container inspect multinode-613405 --format={{.State.Status}}
	I1219 06:45:09.591277 2170338 status.go:371] multinode-613405 host status = "Running" (err=<nil>)
	I1219 06:45:09.591303 2170338 host.go:66] Checking if "multinode-613405" exists ...
	I1219 06:45:09.591682 2170338 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" multinode-613405
	I1219 06:45:09.622004 2170338 host.go:66] Checking if "multinode-613405" exists ...
	I1219 06:45:09.622312 2170338 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:45:09.622361 2170338 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" multinode-613405
	I1219 06:45:09.641725 2170338 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34829 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/multinode-613405/id_rsa Username:docker}
	I1219 06:45:09.746469 2170338 ssh_runner.go:195] Run: systemctl --version
	I1219 06:45:09.753350 2170338 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:45:09.766660 2170338 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1219 06:45:09.826755 2170338 info.go:266] docker info: {ID:6ZPO:QZND:VNGE:LUKL:4Y3K:XELL:AAX4:2GTK:E6LM:MPRN:3ZXR:TTMR Containers:3 ContainersRunning:2 ContainersPaused:0 ContainersStopped:1 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:50 OomKillDisable:true NGoroutines:62 SystemTime:2025-12-19 06:45:09.81712144 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214835200 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-30-239 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1219 06:45:09.827315 2170338 kubeconfig.go:125] found "multinode-613405" server: "https://192.168.67.2:8443"
	I1219 06:45:09.827352 2170338 api_server.go:166] Checking apiserver status ...
	I1219 06:45:09.827399 2170338 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1219 06:45:09.840030 2170338 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1402/cgroup
	I1219 06:45:09.849057 2170338 api_server.go:182] apiserver freezer: "9:freezer:/docker/d964a2fb518c0c17aa565881308848e39b0a42b3ee0e92ad20c4eb2c093136bf/kubepods/burstable/pod4a277d1e07657d64539f2c361066598c/3953fd4203368b4d337ca2ef9cc6b4d1f9ace4cf1056d86fc76bbda39a98326d"
	I1219 06:45:09.849138 2170338 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/d964a2fb518c0c17aa565881308848e39b0a42b3ee0e92ad20c4eb2c093136bf/kubepods/burstable/pod4a277d1e07657d64539f2c361066598c/3953fd4203368b4d337ca2ef9cc6b4d1f9ace4cf1056d86fc76bbda39a98326d/freezer.state
	I1219 06:45:09.857083 2170338 api_server.go:204] freezer state: "THAWED"
	I1219 06:45:09.857113 2170338 api_server.go:253] Checking apiserver healthz at https://192.168.67.2:8443/healthz ...
	I1219 06:45:09.866484 2170338 api_server.go:279] https://192.168.67.2:8443/healthz returned 200:
	ok
	I1219 06:45:09.866521 2170338 status.go:463] multinode-613405 apiserver status = Running (err=<nil>)
	I1219 06:45:09.866532 2170338 status.go:176] multinode-613405 status: &{Name:multinode-613405 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1219 06:45:09.866550 2170338 status.go:174] checking status of multinode-613405-m02 ...
	I1219 06:45:09.866866 2170338 cli_runner.go:164] Run: docker container inspect multinode-613405-m02 --format={{.State.Status}}
	I1219 06:45:09.884948 2170338 status.go:371] multinode-613405-m02 host status = "Running" (err=<nil>)
	I1219 06:45:09.884973 2170338 host.go:66] Checking if "multinode-613405-m02" exists ...
	I1219 06:45:09.885290 2170338 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" multinode-613405-m02
	I1219 06:45:09.903180 2170338 host.go:66] Checking if "multinode-613405-m02" exists ...
	I1219 06:45:09.903485 2170338 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1219 06:45:09.903535 2170338 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" multinode-613405-m02
	I1219 06:45:09.923960 2170338 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:34834 SSHKeyPath:/home/jenkins/minikube-integration/22230-1998525/.minikube/machines/multinode-613405-m02/id_rsa Username:docker}
	I1219 06:45:10.030870 2170338 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1219 06:45:10.046463 2170338 status.go:176] multinode-613405-m02 status: &{Name:multinode-613405-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I1219 06:45:10.046555 2170338 status.go:174] checking status of multinode-613405-m03 ...
	I1219 06:45:10.046890 2170338 cli_runner.go:164] Run: docker container inspect multinode-613405-m03 --format={{.State.Status}}
	I1219 06:45:10.066127 2170338 status.go:371] multinode-613405-m03 host status = "Stopped" (err=<nil>)
	I1219 06:45:10.066190 2170338 status.go:384] host is not running, skipping remaining checks
	I1219 06:45:10.066197 2170338 status.go:176] multinode-613405-m03 status: &{Name:multinode-613405-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.44s)

TestMultiNode/serial/StartAfterStop (7.89s)

=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:282: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 node start m03 -v=5 --alsologtostderr
multinode_test.go:282: (dbg) Done: out/minikube-linux-arm64 -p multinode-613405 node start m03 -v=5 --alsologtostderr: (7.071942196s)
multinode_test.go:290: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 status -v=5 --alsologtostderr
multinode_test.go:306: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (7.89s)

TestMultiNode/serial/RestartKeepsNodes (79.16s)

=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:314: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-613405
multinode_test.go:321: (dbg) Run:  out/minikube-linux-arm64 stop -p multinode-613405
E1219 06:45:27.486843 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:321: (dbg) Done: out/minikube-linux-arm64 stop -p multinode-613405: (25.143457744s)
multinode_test.go:326: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-613405 --wait=true -v=5 --alsologtostderr
E1219 06:45:57.292706 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:46:29.406670 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:326: (dbg) Done: out/minikube-linux-arm64 start -p multinode-613405 --wait=true -v=5 --alsologtostderr: (53.882786439s)
multinode_test.go:331: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-613405
--- PASS: TestMultiNode/serial/RestartKeepsNodes (79.16s)

TestMultiNode/serial/DeleteNode (5.73s)

=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:416: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 node delete m03
multinode_test.go:416: (dbg) Done: out/minikube-linux-arm64 -p multinode-613405 node delete m03: (5.017329399s)
multinode_test.go:422: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 status --alsologtostderr
multinode_test.go:436: (dbg) Run:  kubectl get nodes
multinode_test.go:444: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (5.73s)

TestMultiNode/serial/StopMultiNode (24.09s)

=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:345: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 stop
multinode_test.go:345: (dbg) Done: out/minikube-linux-arm64 -p multinode-613405 stop: (23.911631765s)
multinode_test.go:351: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 status
multinode_test.go:351: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-613405 status: exit status 7 (95.443333ms)

-- stdout --
	multinode-613405
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-613405-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:358: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 status --alsologtostderr
multinode_test.go:358: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-613405 status --alsologtostderr: exit status 7 (85.821587ms)

-- stdout --
	multinode-613405
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-613405-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I1219 06:47:06.900042 2179125 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:47:06.900166 2179125 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:47:06.900180 2179125 out.go:374] Setting ErrFile to fd 2...
	I1219 06:47:06.900186 2179125 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:47:06.900444 2179125 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:47:06.900626 2179125 out.go:368] Setting JSON to false
	I1219 06:47:06.900667 2179125 mustload.go:66] Loading cluster: multinode-613405
	I1219 06:47:06.900741 2179125 notify.go:221] Checking for updates...
	I1219 06:47:06.901701 2179125 config.go:182] Loaded profile config "multinode-613405": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1219 06:47:06.901731 2179125 status.go:174] checking status of multinode-613405 ...
	I1219 06:47:06.902259 2179125 cli_runner.go:164] Run: docker container inspect multinode-613405 --format={{.State.Status}}
	I1219 06:47:06.921015 2179125 status.go:371] multinode-613405 host status = "Stopped" (err=<nil>)
	I1219 06:47:06.921043 2179125 status.go:384] host is not running, skipping remaining checks
	I1219 06:47:06.921050 2179125 status.go:176] multinode-613405 status: &{Name:multinode-613405 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1219 06:47:06.921085 2179125 status.go:174] checking status of multinode-613405-m02 ...
	I1219 06:47:06.921412 2179125 cli_runner.go:164] Run: docker container inspect multinode-613405-m02 --format={{.State.Status}}
	I1219 06:47:06.939168 2179125 status.go:371] multinode-613405-m02 host status = "Stopped" (err=<nil>)
	I1219 06:47:06.939194 2179125 status.go:384] host is not running, skipping remaining checks
	I1219 06:47:06.939201 2179125 status.go:176] multinode-613405-m02 status: &{Name:multinode-613405-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (24.09s)

TestMultiNode/serial/RestartMultiNode (50.71s)

=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:376: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-613405 --wait=true -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd
multinode_test.go:376: (dbg) Done: out/minikube-linux-arm64 start -p multinode-613405 --wait=true -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd: (49.993524341s)
multinode_test.go:382: (dbg) Run:  out/minikube-linux-arm64 -p multinode-613405 status --alsologtostderr
multinode_test.go:396: (dbg) Run:  kubectl get nodes
multinode_test.go:404: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (50.71s)

TestMultiNode/serial/ValidateNameConflict (33.74s)

=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:455: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-613405
multinode_test.go:464: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-613405-m02 --driver=docker  --container-runtime=containerd
multinode_test.go:464: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p multinode-613405-m02 --driver=docker  --container-runtime=containerd: exit status 14 (91.674072ms)

-- stdout --
	* [multinode-613405-m02] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22230
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	! Profile name 'multinode-613405-m02' is duplicated with machine name 'multinode-613405-m02' in profile 'multinode-613405'
	X Exiting due to MK_USAGE: Profile name should be unique

** /stderr **
multinode_test.go:472: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-613405-m03 --driver=docker  --container-runtime=containerd
multinode_test.go:472: (dbg) Done: out/minikube-linux-arm64 start -p multinode-613405-m03 --driver=docker  --container-runtime=containerd: (31.15165348s)
multinode_test.go:479: (dbg) Run:  out/minikube-linux-arm64 node add -p multinode-613405
multinode_test.go:479: (dbg) Non-zero exit: out/minikube-linux-arm64 node add -p multinode-613405: exit status 80 (360.620491ms)

-- stdout --
	* Adding node m03 to cluster multinode-613405 as [worker]
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-613405-m03 already exists in multinode-613405-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_1.log                    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
multinode_test.go:484: (dbg) Run:  out/minikube-linux-arm64 delete -p multinode-613405-m03
multinode_test.go:484: (dbg) Done: out/minikube-linux-arm64 delete -p multinode-613405-m03: (2.074234183s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (33.74s)

TestPreload (127.42s)

=== RUN   TestPreload
preload_test.go:41: (dbg) Run:  out/minikube-linux-arm64 start -p test-preload-553870 --memory=3072 --alsologtostderr --wait=true --preload=false --driver=docker  --container-runtime=containerd
E1219 06:49:34.247430 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
preload_test.go:41: (dbg) Done: out/minikube-linux-arm64 start -p test-preload-553870 --memory=3072 --alsologtostderr --wait=true --preload=false --driver=docker  --container-runtime=containerd: (1m3.85112588s)
preload_test.go:49: (dbg) Run:  out/minikube-linux-arm64 -p test-preload-553870 image pull gcr.io/k8s-minikube/busybox
preload_test.go:49: (dbg) Done: out/minikube-linux-arm64 -p test-preload-553870 image pull gcr.io/k8s-minikube/busybox: (2.194105381s)
preload_test.go:55: (dbg) Run:  out/minikube-linux-arm64 stop -p test-preload-553870
preload_test.go:55: (dbg) Done: out/minikube-linux-arm64 stop -p test-preload-553870: (5.946848255s)
preload_test.go:63: (dbg) Run:  out/minikube-linux-arm64 start -p test-preload-553870 --preload=true --alsologtostderr -v=1 --wait=true --driver=docker  --container-runtime=containerd
E1219 06:50:27.486021 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
preload_test.go:63: (dbg) Done: out/minikube-linux-arm64 start -p test-preload-553870 --preload=true --alsologtostderr -v=1 --wait=true --driver=docker  --container-runtime=containerd: (52.652804253s)
preload_test.go:68: (dbg) Run:  out/minikube-linux-arm64 -p test-preload-553870 image list
helpers_test.go:176: Cleaning up "test-preload-553870" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p test-preload-553870
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p test-preload-553870: (2.524530044s)
--- PASS: TestPreload (127.42s)

TestScheduledStopUnix (109.68s)

=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-linux-arm64 start -p scheduled-stop-541431 --memory=3072 --driver=docker  --container-runtime=containerd
E1219 06:51:12.457348 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
scheduled_stop_test.go:128: (dbg) Done: out/minikube-linux-arm64 start -p scheduled-stop-541431 --memory=3072 --driver=docker  --container-runtime=containerd: (33.389549981s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-541431 --schedule 5m -v=5 --alsologtostderr
minikube stop output:

** stderr ** 
	I1219 06:51:16.594752 2195018 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:51:16.594881 2195018 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:51:16.594893 2195018 out.go:374] Setting ErrFile to fd 2...
	I1219 06:51:16.594899 2195018 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:51:16.595154 2195018 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:51:16.595419 2195018 out.go:368] Setting JSON to false
	I1219 06:51:16.595533 2195018 mustload.go:66] Loading cluster: scheduled-stop-541431
	I1219 06:51:16.595872 2195018 config.go:182] Loaded profile config "scheduled-stop-541431": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1219 06:51:16.595949 2195018 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/config.json ...
	I1219 06:51:16.596124 2195018 mustload.go:66] Loading cluster: scheduled-stop-541431
	I1219 06:51:16.596242 2195018 config.go:182] Loaded profile config "scheduled-stop-541431": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3

** /stderr **
scheduled_stop_test.go:204: (dbg) Run:  out/minikube-linux-arm64 status --format={{.TimeToStop}} -p scheduled-stop-541431 -n scheduled-stop-541431
scheduled_stop_test.go:172: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-541431 --schedule 15s -v=5 --alsologtostderr
minikube stop output:

** stderr ** 
	I1219 06:51:17.075307 2195109 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:51:17.075444 2195109 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:51:17.075458 2195109 out.go:374] Setting ErrFile to fd 2...
	I1219 06:51:17.075465 2195109 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:51:17.076443 2195109 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:51:17.076882 2195109 out.go:368] Setting JSON to false
	I1219 06:51:17.077190 2195109 daemonize_unix.go:73] killing process 2195040 as it is an old scheduled stop
	I1219 06:51:17.080929 2195109 mustload.go:66] Loading cluster: scheduled-stop-541431
	I1219 06:51:17.081609 2195109 config.go:182] Loaded profile config "scheduled-stop-541431": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1219 06:51:17.081722 2195109 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/config.json ...
	I1219 06:51:17.081955 2195109 mustload.go:66] Loading cluster: scheduled-stop-541431
	I1219 06:51:17.082115 2195109 config.go:182] Loaded profile config "scheduled-stop-541431": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3

** /stderr **
scheduled_stop_test.go:172: signal error was:  os: process already finished
I1219 06:51:17.089121 2000386 retry.go:31] will retry after 67.685µs: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.090446 2000386 retry.go:31] will retry after 182.245µs: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.091593 2000386 retry.go:31] will retry after 275.947µs: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.092727 2000386 retry.go:31] will retry after 468.63µs: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.093849 2000386 retry.go:31] will retry after 394.429µs: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.094978 2000386 retry.go:31] will retry after 917.49µs: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.096052 2000386 retry.go:31] will retry after 1.500811ms: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.098213 2000386 retry.go:31] will retry after 1.717845ms: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.100361 2000386 retry.go:31] will retry after 3.450622ms: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.104560 2000386 retry.go:31] will retry after 2.101871ms: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.107767 2000386 retry.go:31] will retry after 6.574748ms: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.115043 2000386 retry.go:31] will retry after 4.389301ms: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.120223 2000386 retry.go:31] will retry after 12.217561ms: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.133456 2000386 retry.go:31] will retry after 14.854859ms: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.148692 2000386 retry.go:31] will retry after 43.590767ms: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
I1219 06:51:17.193130 2000386 retry.go:31] will retry after 56.790189ms: open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/pid: no such file or directory
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-541431 --cancel-scheduled
minikube stop output:

-- stdout --
	* All existing scheduled stops cancelled

-- /stdout --
E1219 06:51:29.406212 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
scheduled_stop_test.go:189: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-541431 -n scheduled-stop-541431
scheduled_stop_test.go:218: (dbg) Run:  out/minikube-linux-arm64 status -p scheduled-stop-541431
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-541431 --schedule 15s -v=5 --alsologtostderr
minikube stop output:

** stderr ** 
	I1219 06:51:43.071240 2195789 out.go:360] Setting OutFile to fd 1 ...
	I1219 06:51:43.071483 2195789 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:51:43.071514 2195789 out.go:374] Setting ErrFile to fd 2...
	I1219 06:51:43.071534 2195789 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1219 06:51:43.071806 2195789 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22230-1998525/.minikube/bin
	I1219 06:51:43.072111 2195789 out.go:368] Setting JSON to false
	I1219 06:51:43.072250 2195789 mustload.go:66] Loading cluster: scheduled-stop-541431
	I1219 06:51:43.072640 2195789 config.go:182] Loaded profile config "scheduled-stop-541431": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3
	I1219 06:51:43.072792 2195789 profile.go:143] Saving config to /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/scheduled-stop-541431/config.json ...
	I1219 06:51:43.073020 2195789 mustload.go:66] Loading cluster: scheduled-stop-541431
	I1219 06:51:43.073181 2195789 config.go:182] Loaded profile config "scheduled-stop-541431": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.3

** /stderr **
scheduled_stop_test.go:172: signal error was:  os: process already finished
scheduled_stop_test.go:218: (dbg) Run:  out/minikube-linux-arm64 status -p scheduled-stop-541431
scheduled_stop_test.go:218: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p scheduled-stop-541431: exit status 7 (73.098376ms)

-- stdout --
	scheduled-stop-541431
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	

-- /stdout --
scheduled_stop_test.go:189: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-541431 -n scheduled-stop-541431
scheduled_stop_test.go:189: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-541431 -n scheduled-stop-541431: exit status 7 (66.66477ms)

-- stdout --
	Stopped

-- /stdout --
scheduled_stop_test.go:189: status error: exit status 7 (may be ok)
helpers_test.go:176: Cleaning up "scheduled-stop-541431" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p scheduled-stop-541431
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p scheduled-stop-541431: (4.606736652s)
--- PASS: TestScheduledStopUnix (109.68s)

TestInsufficientStorage (9.79s)

=== RUN   TestInsufficientStorage
status_test.go:50: (dbg) Run:  out/minikube-linux-arm64 start -p insufficient-storage-964275 --memory=3072 --output=json --wait=true --driver=docker  --container-runtime=containerd
status_test.go:50: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p insufficient-storage-964275 --memory=3072 --output=json --wait=true --driver=docker  --container-runtime=containerd: exit status 26 (7.149634709s)

-- stdout --
	{"specversion":"1.0","id":"0e047bef-11a7-4950-8e16-99980a02694b","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[insufficient-storage-964275] minikube v1.37.0 on Ubuntu 20.04 (arm64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"80fed9d4-00f7-478a-8477-031085281a5d","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=22230"}}
	{"specversion":"1.0","id":"94b70fa1-e13c-4192-b1de-de3f84036eda","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"61448838-82fa-4a05-b52d-a92839fd5cd4","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig"}}
	{"specversion":"1.0","id":"d1bcf018-a480-4d76-b147-2e3f88fe4eda","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube"}}
	{"specversion":"1.0","id":"154aa981-d1da-4797-9090-fb9d829180e1","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-arm64"}}
	{"specversion":"1.0","id":"3e6412f8-e2fc-4589-8f21-46dde01e4c9e","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"8993339c-7bc5-4487-bdb7-d9878f9f73ae","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_TEST_STORAGE_CAPACITY=100"}}
	{"specversion":"1.0","id":"cb844d8a-eccd-4dd5-a2bc-2994923c6a69","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_TEST_AVAILABLE_STORAGE=19"}}
	{"specversion":"1.0","id":"78498072-db5d-444e-ba93-372c5f4e0e87","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"1","message":"Using the docker driver based on user configuration","name":"Selecting Driver","totalsteps":"19"}}
	{"specversion":"1.0","id":"be55c0c8-6855-4a02-965b-36f6f04eee12","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"Using Docker driver with root privileges"}}
	{"specversion":"1.0","id":"42eb72e2-4dde-4e3a-9121-61ed771eadfd","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"3","message":"Starting \"insufficient-storage-964275\" primary control-plane node in \"insufficient-storage-964275\" cluster","name":"Starting Node","totalsteps":"19"}}
	{"specversion":"1.0","id":"2a16c0fb-3fee-4fa1-b319-5bc5cf34ab99","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"5","message":"Pulling base image v0.0.48-1765966054-22186 ...","name":"Pulling Base Image","totalsteps":"19"}}
	{"specversion":"1.0","id":"8719cf5f-bee8-4e74-98ff-4b7a29987680","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"8","message":"Creating docker container (CPUs=2, Memory=3072MB) ...","name":"Creating Container","totalsteps":"19"}}
	{"specversion":"1.0","id":"2b46d473-2019-4471-81a8-7ee3fdadbf7e","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"Try one or more of the following to free up space on the device:\n\n\t\t\t1. Run \"docker system prune\" to remove unused Docker data (optionally with \"-a\")\n\t\t\t2. Increase the storage allocated to Docker for Desktop by clicking on:\n\t\t\t\tDocker icon \u003e Preferences \u003e Resources \u003e Disk Image Size\n\t\t\t3. Run \"minikube ssh -- docker system prune\" if using the Docker container runtime","exitcode":"26","issues":"https://github.com/kubernetes/minikube/issues/9024","message":"Docker is out of disk space! (/var is at 100% of capacity). You can pass '--force' to skip this check.","name":"RSRC_DOCKER_STORAGE","url":""}}

                                                
                                                
-- /stdout --
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p insufficient-storage-964275 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p insufficient-storage-964275 --output=json --layout=cluster: exit status 7 (356.926533ms)

                                                
                                                
-- stdout --
	{"Name":"insufficient-storage-964275","StatusCode":507,"StatusName":"InsufficientStorage","StatusDetail":"/var is almost out of disk space","Step":"Creating Container","StepDetail":"Creating docker container (CPUs=2, Memory=3072MB) ...","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":500,"StatusName":"Error"}},"Nodes":[{"Name":"insufficient-storage-964275","StatusCode":507,"StatusName":"InsufficientStorage","Components":{"apiserver":{"Name":"apiserver","StatusCode":405,"StatusName":"Stopped"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

                                                
                                                
-- /stdout --
** stderr ** 
	E1219 06:52:40.315483 2197623 status.go:458] kubeconfig endpoint: get endpoint: "insufficient-storage-964275" does not appear in /home/jenkins/minikube-integration/22230-1998525/kubeconfig

                                                
                                                
** /stderr **
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p insufficient-storage-964275 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p insufficient-storage-964275 --output=json --layout=cluster: exit status 7 (302.758387ms)

                                                
                                                
-- stdout --
	{"Name":"insufficient-storage-964275","StatusCode":507,"StatusName":"InsufficientStorage","StatusDetail":"/var is almost out of disk space","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":500,"StatusName":"Error"}},"Nodes":[{"Name":"insufficient-storage-964275","StatusCode":507,"StatusName":"InsufficientStorage","Components":{"apiserver":{"Name":"apiserver","StatusCode":405,"StatusName":"Stopped"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

                                                
                                                
-- /stdout --
** stderr ** 
	E1219 06:52:40.620611 2197688 status.go:458] kubeconfig endpoint: get endpoint: "insufficient-storage-964275" does not appear in /home/jenkins/minikube-integration/22230-1998525/kubeconfig
	E1219 06:52:40.630413 2197688 status.go:258] unable to read event log: stat: stat /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/insufficient-storage-964275/events.json: no such file or directory

                                                
                                                
** /stderr **
helpers_test.go:176: Cleaning up "insufficient-storage-964275" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p insufficient-storage-964275
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p insufficient-storage-964275: (1.98461531s)
--- PASS: TestInsufficientStorage (9.79s)
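The start output above is minikube's CloudEvents-style JSON event stream (one object per line, emitted with `--output=json`). A minimal sketch of pulling the error details out of such a stream; the sample event is abbreviated from the log above, and the field names (`type`, `data.exitcode`, `data.name`, `data.message`) are taken directly from it:

```python
import json

# Sample event abbreviated from the log above; minikube emits one
# CloudEvents-style JSON object per line when run with --output=json.
event_line = (
    '{"specversion":"1.0","type":"io.k8s.sigs.minikube.error",'
    '"datacontenttype":"application/json",'
    '"data":{"exitcode":"26","name":"RSRC_DOCKER_STORAGE",'
    '"message":"Docker is out of disk space! (/var is at 100% of capacity)."}}'
)

event = json.loads(event_line)
if event["type"].endswith(".error"):
    data = event["data"]
    # Surface the structured error the test harness checks for.
    print(f'{data["name"]} (exit {data["exitcode"]}): {data["message"]}')
```

In a real consumer the same loop would run over each line of the command's stdout, dispatching on the `type` suffix (`.info`, `.step`, `.error`).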

                                                
                                    
TestRunningBinaryUpgrade (311.31s)

                                                
                                                
=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade

                                                
                                                

                                                
                                                
=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:120: (dbg) Run:  /tmp/minikube-v1.35.0.2205758069 start -p running-upgrade-442211 --memory=3072 --vm-driver=docker  --container-runtime=containerd
E1219 07:00:27.486176 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:120: (dbg) Done: /tmp/minikube-v1.35.0.2205758069 start -p running-upgrade-442211 --memory=3072 --vm-driver=docker  --container-runtime=containerd: (29.596025757s)
version_upgrade_test.go:130: (dbg) Run:  out/minikube-linux-arm64 start -p running-upgrade-442211 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
E1219 07:01:29.406245 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 07:02:37.293479 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 07:04:34.247449 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:130: (dbg) Done: out/minikube-linux-arm64 start -p running-upgrade-442211 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (4m38.899792695s)
helpers_test.go:176: Cleaning up "running-upgrade-442211" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p running-upgrade-442211
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p running-upgrade-442211: (1.975131196s)
--- PASS: TestRunningBinaryUpgrade (311.31s)

                                                
                                    
TestMissingContainerUpgrade (134.91s)

                                                
                                                
=== RUN   TestMissingContainerUpgrade
=== PAUSE TestMissingContainerUpgrade

                                                
                                                

                                                
                                                
=== CONT  TestMissingContainerUpgrade
version_upgrade_test.go:309: (dbg) Run:  /tmp/minikube-v1.35.0.2775293256 start -p missing-upgrade-933186 --memory=3072 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:309: (dbg) Done: /tmp/minikube-v1.35.0.2775293256 start -p missing-upgrade-933186 --memory=3072 --driver=docker  --container-runtime=containerd: (1m3.085827248s)
version_upgrade_test.go:318: (dbg) Run:  docker stop missing-upgrade-933186
version_upgrade_test.go:323: (dbg) Run:  docker rm missing-upgrade-933186
version_upgrade_test.go:329: (dbg) Run:  out/minikube-linux-arm64 start -p missing-upgrade-933186 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:329: (dbg) Done: out/minikube-linux-arm64 start -p missing-upgrade-933186 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (1m8.016355343s)
helpers_test.go:176: Cleaning up "missing-upgrade-933186" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p missing-upgrade-933186
helpers_test.go:179: (dbg) Done: out/minikube-linux-arm64 delete -p missing-upgrade-933186: (2.165635734s)
--- PASS: TestMissingContainerUpgrade (134.91s)

                                                
                                    
TestNoKubernetes/serial/StartNoK8sWithVersion (0.1s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:108: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-410934 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:108: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p NoKubernetes-410934 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=containerd: exit status 14 (102.413377ms)

                                                
                                                
-- stdout --
	* [NoKubernetes-410934] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22230
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22230-1998525/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22230-1998525/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.10s)

                                                
                                    
TestNoKubernetes/serial/StartWithK8s (47.08s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:120: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-410934 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:120: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-410934 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (46.450582263s)
no_kubernetes_test.go:225: (dbg) Run:  out/minikube-linux-arm64 -p NoKubernetes-410934 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (47.08s)

                                                
                                    
TestNoKubernetes/serial/StartWithStopK8s (24.41s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:137: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-410934 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:137: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-410934 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (22.07822905s)
no_kubernetes_test.go:225: (dbg) Run:  out/minikube-linux-arm64 -p NoKubernetes-410934 status -o json
no_kubernetes_test.go:225: (dbg) Non-zero exit: out/minikube-linux-arm64 -p NoKubernetes-410934 status -o json: exit status 2 (314.069308ms)

                                                
                                                
-- stdout --
	{"Name":"NoKubernetes-410934","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

                                                
                                                
-- /stdout --
no_kubernetes_test.go:149: (dbg) Run:  out/minikube-linux-arm64 delete -p NoKubernetes-410934
no_kubernetes_test.go:149: (dbg) Done: out/minikube-linux-arm64 delete -p NoKubernetes-410934: (2.015166365s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (24.41s)
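The `status -o json` object shown above is what the test inspects: a running host whose Kubernetes components are stopped (hence the non-zero exit status 2). A small sketch of checking that condition, using the exact status object copied from the log:

```python
import json

# Status object copied verbatim from the `status -o json` output above.
status = json.loads(
    '{"Name":"NoKubernetes-410934","Host":"Running","Kubelet":"Stopped",'
    '"APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}'
)

# A --no-kubernetes profile should have the host up with the
# Kubernetes components stopped.
assert status["Host"] == "Running"
assert status["Kubelet"] == "Stopped" and status["APIServer"] == "Stopped"
print("host up, kubernetes down - as expected for --no-kubernetes")
```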

                                                
                                    
TestNoKubernetes/serial/Start (8.12s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:161: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-410934 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:161: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-410934 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (8.110157998s)
--- PASS: TestNoKubernetes/serial/Start (8.12s)

                                                
                                    
TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads (0s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads
no_kubernetes_test.go:89: Checking cache directory: /home/jenkins/minikube-integration/22230-1998525/.minikube/cache/linux/arm64/v0.0.0
--- PASS: TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads (0.00s)

                                                
                                    
TestNoKubernetes/serial/VerifyK8sNotRunning (0.29s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:172: (dbg) Run:  out/minikube-linux-arm64 ssh -p NoKubernetes-410934 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 ssh -p NoKubernetes-410934 "sudo systemctl is-active --quiet service kubelet": exit status 1 (286.383132ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 3

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.29s)

                                                
                                    
TestNoKubernetes/serial/ProfileList (0.75s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:194: (dbg) Run:  out/minikube-linux-arm64 profile list
no_kubernetes_test.go:204: (dbg) Run:  out/minikube-linux-arm64 profile list --output=json
--- PASS: TestNoKubernetes/serial/ProfileList (0.75s)

                                                
                                    
TestNoKubernetes/serial/Stop (1.34s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:183: (dbg) Run:  out/minikube-linux-arm64 stop -p NoKubernetes-410934
no_kubernetes_test.go:183: (dbg) Done: out/minikube-linux-arm64 stop -p NoKubernetes-410934: (1.338797367s)
--- PASS: TestNoKubernetes/serial/Stop (1.34s)

                                                
                                    
TestNoKubernetes/serial/StartNoArgs (7.47s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:216: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-410934 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:216: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-410934 --driver=docker  --container-runtime=containerd: (7.473159395s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (7.47s)

                                                
                                    
TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.27s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:172: (dbg) Run:  out/minikube-linux-arm64 ssh -p NoKubernetes-410934 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 ssh -p NoKubernetes-410934 "sudo systemctl is-active --quiet service kubelet": exit status 1 (270.026819ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 3

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.27s)

                                                
                                    
TestStoppedBinaryUpgrade/Setup (0.89s)

                                                
                                                
=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (0.89s)

                                                
                                    
TestStoppedBinaryUpgrade/Upgrade (304.67s)

                                                
                                                
=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:183: (dbg) Run:  /tmp/minikube-v1.35.0.4215084662 start -p stopped-upgrade-057551 --memory=3072 --vm-driver=docker  --container-runtime=containerd
E1219 06:55:27.485884 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:183: (dbg) Done: /tmp/minikube-v1.35.0.4215084662 start -p stopped-upgrade-057551 --memory=3072 --vm-driver=docker  --container-runtime=containerd: (36.375596639s)
version_upgrade_test.go:192: (dbg) Run:  /tmp/minikube-v1.35.0.4215084662 -p stopped-upgrade-057551 stop
version_upgrade_test.go:192: (dbg) Done: /tmp/minikube-v1.35.0.4215084662 -p stopped-upgrade-057551 stop: (1.224665041s)
version_upgrade_test.go:198: (dbg) Run:  out/minikube-linux-arm64 start -p stopped-upgrade-057551 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
E1219 06:56:29.406607 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-125117/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:56:50.535159 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1219 06:59:34.247274 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/functional-006924/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:198: (dbg) Done: out/minikube-linux-arm64 start -p stopped-upgrade-057551 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (4m27.070796614s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (304.67s)

                                                
                                    
TestStoppedBinaryUpgrade/MinikubeLogs (2.64s)

                                                
                                                
=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:206: (dbg) Run:  out/minikube-linux-arm64 logs -p stopped-upgrade-057551
version_upgrade_test.go:206: (dbg) Done: out/minikube-linux-arm64 logs -p stopped-upgrade-057551: (2.637670325s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (2.64s)

                                                
                                    
TestPause/serial/Start (52.51s)

                                                
                                                
=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -p pause-962000 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=containerd
E1219 07:05:27.486126 2000386 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22230-1998525/.minikube/profiles/addons-064622/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
pause_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -p pause-962000 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=containerd: (52.508734296s)
--- PASS: TestPause/serial/Start (52.51s)

                                                
                                    
TestPause/serial/SecondStartNoReconfiguration (6.24s)

                                                
                                                
=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-linux-arm64 start -p pause-962000 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
pause_test.go:92: (dbg) Done: out/minikube-linux-arm64 start -p pause-962000 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (6.224591263s)
--- PASS: TestPause/serial/SecondStartNoReconfiguration (6.24s)

                                                
                                    
TestPause/serial/Pause (0.73s)

                                                
                                                
=== RUN   TestPause/serial/Pause
pause_test.go:110: (dbg) Run:  out/minikube-linux-arm64 pause -p pause-962000 --alsologtostderr -v=5
--- PASS: TestPause/serial/Pause (0.73s)

                                                
                                    
TestPause/serial/VerifyStatus (0.33s)

                                                
                                                
=== RUN   TestPause/serial/VerifyStatus
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p pause-962000 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p pause-962000 --output=json --layout=cluster: exit status 2 (326.494482ms)

                                                
                                                
-- stdout --
	{"Name":"pause-962000","StatusCode":418,"StatusName":"Paused","Step":"Done","StepDetail":"* Paused 7 containers in: kube-system, kubernetes-dashboard, istio-operator","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":200,"StatusName":"OK"}},"Nodes":[{"Name":"pause-962000","StatusCode":200,"StatusName":"OK","Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

                                                
                                                
-- /stdout --
--- PASS: TestPause/serial/VerifyStatus (0.33s)
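The `--layout=cluster` status output in this report pairs HTTP-style numeric codes with names: 200 OK, 405 Stopped, 418 Paused, 500 Error, and 507 InsufficientStorage all appear above. A sketch of decoding them, using a cluster object abbreviated from the log (the code/name table below lists only the pairs observed in this report):

```python
import json

# StatusCode/StatusName pairs as they appear in this report's
# `status --output=json --layout=cluster` output.
STATUS_NAMES = {200: "OK", 405: "Stopped", 418: "Paused",
                500: "Error", 507: "InsufficientStorage"}

# Abbreviated from the pause-962000 status output above.
cluster = json.loads(
    '{"Name":"pause-962000","StatusCode":418,"StatusName":"Paused",'
    '"Nodes":[{"Name":"pause-962000","StatusCode":200,"StatusName":"OK"}]}'
)

# The numeric code and the human-readable name should agree.
assert STATUS_NAMES[cluster["StatusCode"]] == cluster["StatusName"]
for node in cluster["Nodes"]:
    assert STATUS_NAMES[node["StatusCode"]] == node["StatusName"]
print(f'cluster {cluster["Name"]}: {cluster["StatusName"]}')
```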

                                                
                                    
TestPause/serial/Unpause (0.64s)

                                                
                                                
=== RUN   TestPause/serial/Unpause
pause_test.go:121: (dbg) Run:  out/minikube-linux-arm64 unpause -p pause-962000 --alsologtostderr -v=5
--- PASS: TestPause/serial/Unpause (0.64s)

                                                
                                    
TestPause/serial/PauseAgain (0.89s)

                                                
                                                
=== RUN   TestPause/serial/PauseAgain
pause_test.go:110: (dbg) Run:  out/minikube-linux-arm64 pause -p pause-962000 --alsologtostderr -v=5
--- PASS: TestPause/serial/PauseAgain (0.89s)

                                                
                                    
TestPause/serial/DeletePaused (2.79s)

                                                
                                                
=== RUN   TestPause/serial/DeletePaused
pause_test.go:132: (dbg) Run:  out/minikube-linux-arm64 delete -p pause-962000 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-linux-arm64 delete -p pause-962000 --alsologtostderr -v=5: (2.792778681s)
--- PASS: TestPause/serial/DeletePaused (2.79s)

                                                
                                    
TestPause/serial/VerifyDeletedResources (0.42s)

                                                
                                                
=== RUN   TestPause/serial/VerifyDeletedResources
pause_test.go:142: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
pause_test.go:168: (dbg) Run:  docker ps -a
pause_test.go:173: (dbg) Run:  docker volume inspect pause-962000
pause_test.go:173: (dbg) Non-zero exit: docker volume inspect pause-962000: exit status 1 (34.206339ms)

                                                
                                                
-- stdout --
	[]

                                                
                                                
-- /stdout --
** stderr ** 
	Error response from daemon: get pause-962000: no such volume

                                                
                                                
** /stderr **
pause_test.go:178: (dbg) Run:  docker network ls
--- PASS: TestPause/serial/VerifyDeletedResources (0.42s)

                                                
                                    

Test skip (35/321)

Order skipped test Duration
5 TestDownloadOnly/v1.28.0/cached-images 0
6 TestDownloadOnly/v1.28.0/binaries 0
7 TestDownloadOnly/v1.28.0/kubectl 0
14 TestDownloadOnly/v1.34.3/cached-images 0
15 TestDownloadOnly/v1.34.3/binaries 0
16 TestDownloadOnly/v1.34.3/kubectl 0
23 TestDownloadOnly/v1.35.0-rc.1/cached-images 0
24 TestDownloadOnly/v1.35.0-rc.1/binaries 0
25 TestDownloadOnly/v1.35.0-rc.1/kubectl 0
29 TestDownloadOnlyKic 0.44
31 TestOffline 0
42 TestAddons/serial/GCPAuth/RealCredentials 0
49 TestAddons/parallel/Olm 0
56 TestAddons/parallel/AmdGpuDevicePlugin 0
60 TestDockerFlags 0
64 TestHyperKitDriverInstallOrUpdate 0
65 TestHyperkitDriverSkipUpgrade 0
112 TestFunctional/parallel/MySQL 0
116 TestFunctional/parallel/DockerEnv 0
117 TestFunctional/parallel/PodmanEnv 0
130 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0
131 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0
132 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0
207 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL 0
211 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv 0
212 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv 0
224 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDig 0
225 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0
226 TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessThroughDNS 0
261 TestGvisorAddon 0
283 TestImageBuild 0
284 TestISOImage 0
348 TestChangeNoneUser 0
351 TestScheduledStopWindows 0
353 TestSkaffold 0
TestDownloadOnly/v1.28.0/cached-images (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.0/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.28.0/cached-images (0.00s)

                                                
                                    
TestDownloadOnly/v1.28.0/binaries (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.0/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.28.0/binaries (0.00s)

                                                
                                    
TestDownloadOnly/v1.28.0/kubectl (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.0/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.28.0/kubectl (0.00s)

                                                
                                    
TestDownloadOnly/v1.34.3/cached-images (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.34.3/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.34.3/cached-images (0.00s)

                                                
                                    
TestDownloadOnly/v1.34.3/binaries (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.34.3/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.34.3/binaries (0.00s)

                                                
                                    
TestDownloadOnly/v1.34.3/kubectl (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.34.3/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.34.3/kubectl (0.00s)

                                                
                                    
TestDownloadOnly/v1.35.0-rc.1/cached-images (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.35.0-rc.1/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.35.0-rc.1/cached-images (0.00s)

                                                
                                    
TestDownloadOnly/v1.35.0-rc.1/binaries (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.35.0-rc.1/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.35.0-rc.1/binaries (0.00s)

TestDownloadOnly/v1.35.0-rc.1/kubectl (0s)
=== RUN   TestDownloadOnly/v1.35.0-rc.1/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.35.0-rc.1/kubectl (0.00s)

TestDownloadOnlyKic (0.44s)
=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:231: (dbg) Run:  out/minikube-linux-arm64 start --download-only -p download-docker-589926 --alsologtostderr --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:248: Skip for arm64 platform. See https://github.com/kubernetes/minikube/issues/10144
helpers_test.go:176: Cleaning up "download-docker-589926" profile ...
helpers_test.go:179: (dbg) Run:  out/minikube-linux-arm64 delete -p download-docker-589926
--- SKIP: TestDownloadOnlyKic (0.44s)

TestOffline (0s)
=== RUN   TestOffline
=== PAUSE TestOffline
=== CONT  TestOffline
aab_offline_test.go:35: skipping TestOffline - only docker runtime supported on arm64. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestOffline (0.00s)

TestAddons/serial/GCPAuth/RealCredentials (0s)
=== RUN   TestAddons/serial/GCPAuth/RealCredentials
addons_test.go:761: This test requires a GCE instance (excluding Cloud Shell) with a container based driver
--- SKIP: TestAddons/serial/GCPAuth/RealCredentials (0.00s)

TestAddons/parallel/Olm (0s)
=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm
=== CONT  TestAddons/parallel/Olm
addons_test.go:485: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestAddons/parallel/AmdGpuDevicePlugin (0s)
=== RUN   TestAddons/parallel/AmdGpuDevicePlugin
=== PAUSE TestAddons/parallel/AmdGpuDevicePlugin
=== CONT  TestAddons/parallel/AmdGpuDevicePlugin
addons_test.go:1035: skip amd gpu test on all but docker driver and amd64 platform
--- SKIP: TestAddons/parallel/AmdGpuDevicePlugin (0.00s)

TestDockerFlags (0s)
=== RUN   TestDockerFlags
docker_test.go:41: skipping: only runs with docker container runtime, currently testing containerd
--- SKIP: TestDockerFlags (0.00s)

TestHyperKitDriverInstallOrUpdate (0s)
=== RUN   TestHyperKitDriverInstallOrUpdate
driver_install_or_update_test.go:37: Skip if not darwin.
--- SKIP: TestHyperKitDriverInstallOrUpdate (0.00s)

TestHyperkitDriverSkipUpgrade (0s)
=== RUN   TestHyperkitDriverSkipUpgrade
driver_install_or_update_test.go:101: Skip if not darwin.
--- SKIP: TestHyperkitDriverSkipUpgrade (0.00s)

TestFunctional/parallel/MySQL (0s)
=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1792: arm64 is not supported by mysql. Skip the test. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestFunctional/parallel/MySQL (0.00s)

TestFunctional/parallel/DockerEnv (0s)
=== RUN   TestFunctional/parallel/DockerEnv
=== PAUSE TestFunctional/parallel/DockerEnv
=== CONT  TestFunctional/parallel/DockerEnv
functional_test.go:478: only validate docker env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/DockerEnv (0.00s)

TestFunctional/parallel/PodmanEnv (0s)
=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv
=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:565: only validate podman env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.00s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0s)
=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL (0s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL
functional_test.go:1792: arm64 is not supported by mysql. Skip the test. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/MySQL (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv (0s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv
functional_test.go:478: only validate docker env with docker container runtime, currently testing containerd
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/DockerEnv (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv (0s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv
functional_test.go:565: only validate podman env with docker container runtime, currently testing containerd
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/PodmanEnv (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDig (0s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDig (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessThroughDNS (0s)
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-rc.1/parallel/TunnelCmd/serial/AccessThroughDNS (0.00s)

TestGvisorAddon (0s)
=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

TestImageBuild (0s)
=== RUN   TestImageBuild
image_test.go:33: 
--- SKIP: TestImageBuild (0.00s)

TestISOImage (0s)
=== RUN   TestISOImage
iso_test.go:36: This test requires a VM driver
--- SKIP: TestISOImage (0.00s)

TestChangeNoneUser (0s)
=== RUN   TestChangeNoneUser
none_test.go:38: Test requires none driver and SUDO_USER env to not be empty
--- SKIP: TestChangeNoneUser (0.00s)

TestScheduledStopWindows (0s)
=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestSkaffold (0s)
=== RUN   TestSkaffold
skaffold_test.go:45: skaffold requires docker-env, currently testing containerd container runtime
--- SKIP: TestSkaffold (0.00s)

                                                
                                    